Here’s some miscellaneous documentation about using Calcite and its various adapters.
- Building from a source distribution
- Building from Git
- Gradle vs Gradle wrapper
- Running tests
- Running integration tests
- Getting started
- Setting up an IDE for contributing
- Debugging generated classes in Intellij
- CSV adapter
- MongoDB adapter
- Splunk adapter
- Implementing an adapter
- Advanced topics for developers
- Advanced topics for committers
- Managing Calcite repositories through GitHub
- Merging pull requests
- Set up PGP signing keys
- Set up Nexus repository credentials
- Making a snapshot
- Making a release candidate
- Cleaning up after a failed release attempt
- Validate a release
- Get approval for a release via Apache voting process
- Publishing a release
- Publishing the web site
Building from a source distribution
Prerequisite is Java (JDK 8, 9, 10, 11, 12, 13, 14 or 15) on your path.
Unpack the source distribution
cd to the root directory of the unpacked source,
then build using the included Gradle wrapper:
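The commands are missing from this copy of the page; a minimal sketch, where the archive name apache-calcite-X.Y.Z-src.tar.gz is a placeholder for the actual source distribution:

```shell
# Unpack the source distribution (archive name is illustrative)
tar xvfz apache-calcite-X.Y.Z-src.tar.gz
# cd to the root directory of the unpacked source
cd apache-calcite-X.Y.Z-src
# build using the wrapper script shipped at the repository root
./gradlew build
```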
Running tests describes how to run more or fewer tests.
Building from Git
Prerequisites are git and Java (JDK 8, 9, 10, 11, 12, 13, 14 or 15) on your path.
Create a local copy of the github repository,
cd to its root directory,
then build using the included Gradle wrapper:
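The commands are missing here; a sketch using the project's GitHub mirror:

```shell
# Create a local copy of the repository
git clone https://github.com/apache/calcite.git
# cd to its root directory
cd calcite
# build using the included Gradle wrapper
./gradlew build
```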
Calcite includes a number of machine-generated source files. By default, these are regenerated on every build, which has the negative side effect of recompiling the entire project even when the hand-written code has not changed.
Typically re-generation is triggered automatically when the relevant templates change, and it should work transparently. However, if your IDE does not generate sources, you can run the ./gradlew generateSources task manually.
Running tests describes how to run more or fewer tests.
Gradle vs Gradle wrapper
Calcite uses the Gradle wrapper to provide a consistent build environment. In the typical case you don’t need to install Gradle manually; ./gradlew will download the proper version for you and verify the expected checksum. You can install Gradle manually, but note that there may be incompatibilities between different Gradle versions.
The test suite will run by default when you build, unless you skip it (for example with Gradle’s -x test option). You can use ./gradlew assemble to build the artifacts and skip all tests and verifications.
There are other options that control which tests are run, and in what environment, as follows.
- -Dcalcite.test.db=DB (where DB is, for example, postgresql) allows you to change the JDBC data source for the test suite. Calcite’s test suite requires a JDBC data source populated with the foodmart data set.
  - hsqldb, the default, uses an in-memory hsqldb database.
  - All others access a test virtual machine (see integration tests below). postgresql might be somewhat faster than hsqldb, but you need to populate it (i.e. provision a VM).
- -Dcalcite.debug prints extra debugging information to stdout.
- -Dcalcite.test.splunk enables tests that run against Splunk. Splunk must be installed and running.
- ./gradlew testSlow runs tests that take longer to execute. For example, there are tests that create virtual TPC-H and TPC-DS schemas in-memory and run tests from those benchmarks.
Note: tests are executed in a forked JVM, so system properties are not passed automatically when running tests with Gradle. By default, the build script passes the relevant properties (calcite.test.db and the others above) to the forked test JVM.
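For example, the options above might be combined as follows (the property values are illustrative):

```shell
# Run the regular test suite against PostgreSQL with extra debug output
./gradlew test -Dcalcite.test.db=postgresql -Dcalcite.debug

# Run the slow tests as well
./gradlew testSlow
```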
Running integration tests
For testing Calcite’s external adapters, a test virtual machine should be used. The VM includes Cassandra, Druid, H2, HSQLDB, MySQL, MongoDB, and PostgreSQL.
The test VM requires 5 GiB of disk space, and it takes about 30 minutes to build.
Note: you can use calcite-test-dataset to populate your own database; however, it is recommended to use the test VM so that the test environment can be reproduced.
1) Clone https://github.com/vlsi/calcite-test-dataset.git at the same level as calcite repository. For instance:
Note: integration tests search for ../calcite-test-dataset or ../../calcite-test-dataset. You can specify the full path via the calcite.test.dataset system property.
2) Build and start the VM:
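The build command is missing from this copy; based on the calcite-test-dataset README, the VM is typically built and started like this (sketch):

```shell
# Build and provision the test VM; this starts it via Vagrant
cd calcite-test-dataset
mvn install
```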
The test VM is provisioned by Vagrant, so the regular vagrant up and vagrant halt commands should be used to start and stop the VM.
The connection strings for different databases are listed in calcite-test-dataset readme.
Suggested test flow
Note: the test VM should be started before you launch integration tests. Calcite itself does not start or stop the VM.
- Executing regular unit tests (does not require external data): no change.
- Executing all tests, for all the DBs: ./gradlew test integTestAll.
- Executing just tests for external DBs, excluding unit tests:
- Executing PostgreSQL JDBC tests:
- Executing just MongoDB tests:
From within IDE:
- Executing regular unit tests: no change.
- Executing MongoDB tests: run
- Executing MySQL tests: run
- Executing PostgreSQL tests: run
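Putting the command-line flow above together (only the test and integTestAll task names appear in this copy of the page; the others are truncated):

```shell
# Regular unit tests (no external data required)
./gradlew test

# All tests, for all the DBs (requires the test VM to be running)
./gradlew test integTestAll
```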
Integration tests technical details
Tests with external data are executed at Maven’s integration-test phase. We do not currently use pre-integration-test/post-integration-test, but we could in the future. The verification of build pass/failure is performed at the verify phase. Integration tests should be named ...IT.java, so they are not picked up during unit test execution.
Getting started
See the developers guide.
Setting up an IDE for contributing
Setting up IntelliJ IDEA
Download a version of IntelliJ IDEA newer than 2018.X. Versions 2019.2 and 2019.3 have been tested by members of the community and appear to be stable. Older versions of IDEA may still work without problems for Calcite sources that do not use the Gradle build (release 1.21.0 and earlier).
Follow the standard steps for the installation of IDEA and set up one of the JDK versions currently supported by Calcite.
Start with building Calcite from the command line.
Go to File > Open… and open Calcite’s root directory.
When IntelliJ asks if you want to open it as a project or a file, select project.
Also, say yes when it asks if you want a new window.
IntelliJ’s Gradle project importer should handle the rest.
There is a partially implemented IntelliJ code style configuration that you can import located on GitHub. It does not do everything needed to make Calcite’s style checker happy, but it does a decent amount of it. To import, go to Preferences > Editor > Code Style, click the gear next to “scheme”, then Import Scheme > IntelliJ IDEA Code Style XML.
Once the importer is finished, test the project setup.
For example, use Navigate > Symbol and enter testWinAgg to find the test method, then run it by right-clicking and selecting Run (or the equivalent keyboard shortcut).
Setting up NetBeans
From the main menu, select File > Open Project, navigate to the project (Calcite), which has a small Gradle icon, and choose to open it. Wait for NetBeans to finish importing all dependencies.
To ensure that the project is configured successfully, navigate to a test method, right-click on it, and select Run Focused Test Method. NetBeans will run a Maven process, and in the command output window you should see a line with Running org.apache.calcite.test.JdbcTest followed by BUILD SUCCESS.
Note: it is not clear whether NetBeans automatically generates the relevant sources on project import, so you might need to run ./gradlew generateSources before importing the project (and whenever you update template parser sources or the project version).
To enable tracing, add the following flags to the java command line:
The first flag causes Calcite to print the Java code it generates (to execute queries) to stdout. It is especially useful if you are debugging mysterious problems like this:
Exception in thread "main" java.lang.ClassCastException: Integer cannot be cast to Long
at Baz$1$1.current(Unknown Source)
By default, Calcite uses the Log4j bindings for SLF4J. There is a provided configuration file which outputs logging at the INFO level to the console. You can modify the level for the rootLogger to increase verbosity, or change the level for a specific class if you so choose.
Debugging generated classes in Intellij
To debug generated classes, set two system properties when starting the JVM:
- -Dorg.codehaus.janino.source_debugging.enable=true
- -Dorg.codehaus.janino.source_debugging.dir=C:\tmp (this property is optional; if not set, Janino will create temporary files in the system’s default location for temporary files, such as /tmp on Unix-based systems)
After code is generated, either go into Intellij and mark the folder that
contains generated temporary files as a generated sources root or sources root,
or directly set the value of
org.codehaus.janino.source_debugging.dir to an
existing source root when starting the JVM.
CSV adapter
See the tutorial.
MongoDB adapter
First, download and install Calcite, and install MongoDB.
Note: you can use MongoDB from the integration-test virtual machine described above.
Import MongoDB’s zipcode data set into MongoDB:
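The import commands are not shown in this copy; a sketch using MongoDB’s standard tools (the zips.json URL is MongoDB’s published sample data set; the database and collection names are assumptions):

```shell
# Download MongoDB's sample zipcode data set and import it
wget https://media.mongodb.org/zips.json
mongoimport --db test --collection zips --file zips.json
```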
Log into MongoDB to check it’s there:
Connect using the mongo-model.json Calcite model:
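The connection step might look like this with sqlline (the model-file path is an assumption; substitute the actual location of mongo-model.json in your tree):

```shell
# Start sqlline, then connect using the MongoDB model at the prompt:
./sqlline
# !connect jdbc:calcite:model=mongodb/src/test/resources/mongo-model.json admin admin
```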
Splunk adapter
To run the test suite and sample queries against Splunk, load Splunk’s tutorialdata.zip data set as described in the Splunk tutorial.
(This step is optional, but it provides some interesting data for the sample queries. It is also necessary if you intend to run the test suite, using -Dcalcite.test.splunk.)
Implementing an adapter
New adapters can be created by implementing
Testing adapter in Java
The example below shows how a SQL query can be submitted to CalcitePrepare with a custom context (AdapterContext in this case). Calcite prepares and implements the query execution, using the resources provided by the context. The prepare result provides access to the underlying enumerable and methods for enumeration. The enumerable itself can naturally be some adapter implementation.
Advanced topics for developers
The following sections might be of interest if you are adding features to particular parts of the code base. You don’t need to understand these topics if you are just building from source and running tests.
When Calcite compares types (instances of
RelDataType), it requires them to be the same
object. If there are two distinct type instances that refer to the
same Java type, Calcite may fail to recognize that they match. It is recommended to:
- Use a single instance of JavaTypeFactory within the Calcite context;
- Store the types so that the same object is always returned for the same type.
Rebuilding generated Protocol Buffer code
Calcite’s Avatica Server component supports RPC serialization using Protocol Buffers. In the context of Avatica, Protocol Buffers can generate a collection of messages defined by a schema. The library itself can parse old serialized messages using a new schema. This is highly desirable in an environment where the client and server are not guaranteed to have the same version of objects.
Typically, the code generated by the Protocol Buffers library doesn’t need to be regenerated on every build, only when the schema changes.
First, install Protobuf 3.0:
Then, re-generate the compiled code:
Create a planner rule
Create a class that extends
RelRule (or occasionally a sub-class).
The class name should indicate the basic RelNode types that are matched, sometimes followed by what the rule does, then the word Rule.
The rule must have a constructor that takes a
Config as argument.
It should be protected, and will only be called from Config.toRule().
The class must contain an interface called
Config that extends
RelRule.Config (or the config of the rule’s super-class).
Config must implement the
toRule method and create a rule.
Config must have a member called
DEFAULT that creates a typical
configuration. At a minimum, it must call
withOperandSupplier to create
a typical tree of operands.
The rule should not have a static INSTANCE field. There should be an instance of the rule in a holder class such as CoreRules.
The holder class may contain other instances of the rule with different parameters, if they are commonly used.
If the rule is instantiated with several patterns of operands
(for instance, with different sub-classes of the same base RelNode classes,
or with different predicates) the config may contain a method
to make it easier to build common operand patterns. (See FilterAggregateTransposeRule for an example.)
Advanced topics for committers
The following sections are of interest to Calcite committers and in particular release managers.
Managing Calcite repositories through GitHub
Committers have write access to Calcite’s ASF git repositories hosting the source code of the project as well as the website.
All repositories present on GitBox are available on GitHub with write-access enabled, including rights to open/close/merge pull requests and address issues.
In order to exploit the GitHub services, committers should link their ASF and GitHub accounts via the account linking page.
Here are the steps:
- Set your GitHub username into your Apache profile.
- Enable GitHub 2FA on your GitHub account.
- Activating GitHub 2FA changes the authentication process and may affect the way you access GitHub. You may need to establish personal access tokens or upload your public SSH key to GitHub depending on the protocol that you are using (HTTPS vs. SSH).
- Merge your Apache and GitHub accounts using the account linking page (you should see 3 green checks in GitBox).
- Wait at least 30 minutes for an email inviting you to Apache GitHub Organization.
- Accept the invitation and verify that you are a member of the team.
Merging pull requests
These are instructions for a Calcite committer who has reviewed a pull request from a contributor, found it satisfactory, and is about to merge it to master. Usually the contributor is not a committer (otherwise they would be committing it themselves, after you gave approval in a review).
There are certain kinds of continuous integration tests that are not run
automatically against the PR. These tests can be triggered explicitly by adding
an appropriate label to the PR. For instance, you can run slow tests by adding the slow-tests-needed label. It is up to you to decide whether these additional tests need to run before merging.
If the PR has multiple commits, squash them into a single commit. The commit message should follow the conventions outlined in the contribution guidelines. If there are conflicts, it is better to ask the contributor to take this step; otherwise, it is preferred to do this manually, since it saves time and avoids unnecessary notification messages to many people on GitHub.
If the contributor is not a committer, add their name in parentheses at the end of the first line of the commit message.
If the merge is performed via command line (not through the GitHub web interface), make sure the message contains a line “Close apache/calcite#YYY”, where YYY is the GitHub pull request identifier.
When the PR has been merged and pushed, be sure to update the JIRA case. You must:
- resolve the issue (do not close it as this will be done by the release manager);
- select “Fixed” as resolution cause;
- mark the appropriate version (e.g., 1.26.0) in the “Fix version” field;
- add a comment (e.g., “Fixed in …”) with a hyperlink pointing to the commit which resolves the issue (in GitHub or GitBox), and also thank the contributor for their contribution.
Set up PGP signing keys
Follow instructions here to
create a key pair. (On macOS, I did brew install gpg and gpg --gen-key.)
Add your public key to the KEYS file by following the instructions in the KEYS file. If you don’t have permission to update the KEYS file, ask the PMC for help. The KEYS file is not present in the git repo or in a release tarball because that would be
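The KEYS file itself spells out the exact commands; the usual Apache pattern looks roughly like this (the key’s user ID is a placeholder):

```shell
# Append your key's signatures and ASCII-armored public key to KEYS
(gpg --list-sigs "Your Name" && gpg --armor --export "Your Name") >> KEYS
```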
Set up Nexus repository credentials
Gradle provides multiple ways to configure project properties.
For instance, you could update the $HOME/.gradle/gradle.properties file.
Note: the build script will print the missing properties, so you can try running it and let it complain about the missing ones.
The following options are used:
- asfSvnUsername is your Apache ID, and asfSvnPassword is the corresponding password.
Note: when https://github.com/vlsi/asflike-release-environment is used, the credentials are taken from
- asfGitSourceUsername is your GitHub ID, while asfGitSourcePassword is not your GitHub password: you need to generate a personal access token at https://github.com/settings/tokens.
Note: if you want to use gpg-agent, you need to pass additional properties:
Making a snapshot
Before you start:
- Make sure you are using JDK 8.
- Make sure build and tests succeed with
Making a release candidate
Note: release artifacts (dist.apache.org and repository.apache.org) are managed with stage-vote-release-plugin
Before you start:
- Set up signing keys as described above.
- Make sure you are using JDK 8 (not 9 or 10).
- Check that site/_docs/howto.md has the correct version number.
- Check that NOTICE has the current copyright year.
- Check that calcite.version has the proper value in
- Make sure build and tests succeed
- Make sure that ./gradlew javadoc succeeds (i.e. gives no errors; warnings are OK)
- Generate a report of vulnerabilities that occur among dependencies, using ./gradlew dependencyCheckUpdate dependencyCheckAggregate. Report to firstname.lastname@example.org if new critical vulnerabilities are found among dependencies.
- Decide the supported configurations of JDK, operating system and
Guava. These will probably be the same as those described in the
release notes of the previous release. Document them in the release
notes. To test Guava version x.y, specify
- Optional tests using properties:
- Optional tests using tasks:
- Trigger a Coverity scan by merging the latest code into the julianhyde/coverity_scan branch, and when it completes, make sure that there are no important issues.
- Add release notes to site/_docs/history.md. Include the commit history, and say which versions of Java, Guava and operating systems the release is tested against.
- Make sure that every “resolved” JIRA case (including duplicates) has a fix version assigned (most likely the version we are just about to release)
sqlline with Spatial and Oracle function tables:
The release candidate process does not add commits, so there’s no harm if it fails. It might leave a -rc tag behind, which can be removed if required.
You can perform a dry run of the release with the help of https://github.com/vlsi/asflike-release-environment. It performs the same steps, but pushes changes to mock Nexus, Git, and SVN servers.
If any of the steps fail, fix the problem, and start again from the top.
To prepare a release candidate directly in your environment:
Pick a release candidate index and ensure it does not interfere with previous candidates for the version.
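As a sketch, assuming the stage-vote-release-plugin mentioned above (the prepareVote task name and the rc property are assumptions; check the plugin’s documentation for the exact invocation):

```shell
# Build and stage a release candidate; the rc index 1 is illustrative
./gradlew prepareVote -Prc=1
```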
Checking the artifacts
- In the release/build/distributions directory there should be these 3 files, among others:
- Note that the file names start
- In the source distro .tar.gz (currently there is no binary distro), check that all files belong to a directory called
- That directory must contain files
- Check that the version in
- Check that the copyright year in
- Check that the version in
- Make sure that there is no KEYS file in the source distros
- In each .jar (for example mongodb/build/libs/calcite-mongodb-X.Y.Z-sources.jar), check that the
- Check PGP, per this
Verify the staged artifacts in the Nexus repository:
- Go to https://repository.apache.org/ and login
- Under Build Promotion, click Staging Repositories
- In the Staging Repositories tab there should be a line with profile
- Navigate through the artifact tree and make sure the .jar, .pom, .asc files are present
- Check the box in the first column of the row, and press the ‘Close’ button to publish the repository at https://repository.apache.org/content/repositories/orgapachecalcite-1000 (or a similar URL)
Cleaning up after a failed release attempt
If something is not correct, you can fix it, commit it, and prepare the next candidate. The release candidate tags might be kept for a while.
Validate a release
Get approval for a release via Apache voting process
Release vote on dev list
Note: the draft mail is printed as the final step of
and you can find the draft in
After vote finishes, send out the result:
Publishing a release
After a successful release vote, we need to push the release out to mirrors, and other tasks.
Choose a release date. This is based on the time when you expect to announce the release. This is usually a day after the vote closes. Remember that UTC date changes at 4pm Pacific time.
Publishing directly in your environment:
If there are now more than 2 releases, clear out the oldest ones:
The old releases will remain available in the release archive.
You should receive an email from the Apache Reporter Service. Make sure to add the version number and date of the latest release at the site linked to in the email.
Update the site with the release note, the release announcement, and the javadoc of the new version.
The javadoc can be generated only from a final version (not a SNAPSHOT), so check out the most recent tag and start working there (git checkout calcite-X.Y.Z). Add a release announcement by copying
Generate the javadoc, and preview the site by following the instructions in site/README.md. Check that the announcement, javadoc, and release note appear correctly, and then publish the site following the instructions in the same file. Now check out the release branch again (git checkout branch-X.Y) and commit the release announcement.
Merge the release branch back into master (git merge --ff-only branch-X.Y) and align master with the site branch (e.g., git merge --ff-only site).
In JIRA, search for all issues resolved in this release, and do a bulk update (choose the transition issues option) changing their status to “Closed”, with a change comment “Resolved in release X.Y.Z (YYYY-MM-DD)” (fill in release number and date appropriately). Uncheck “Send mail for this update”. Under the releases tab of the Calcite project, mark the release X.Y.Z as released. If it does not already exist, also create a new version (e.g., X.Y+1.Z) for the next release.
After 24 hours, announce the release by sending an email to email@example.com using an @apache.org address. You can use the 1.20.0 announcement as a template. Be sure to include a brief description of the project.
Publishing the web site
See instructions in site/README.md.