To get the prerequisites on a fresh Ubuntu 16.04 LTS run:
    [sudo] apt-get install python3-dev python3-pip python3-psycopg2 python3-tidylib phpunit php-cgi
    pip3 install --user behave nose
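As a quick sanity check after installation, a loop like the following (purely illustrative; the tool names mirror the packages above) reports which of the required tools are on your PATH:

```shell
# Report which prerequisite tools are available on PATH.
# The tool list corresponds to the packages installed above.
for tool in python3 pip3 phpunit behave; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: found"
    else
        echo "$tool: MISSING"
    fi
done
```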
This test directory is structured as follows:
```
+-  bdd         Functional API tests
|  \
|  +- steps     Step implementations for test descriptions
+-  php         PHP unit tests
+-  scenes      Geometry test data
+-  testdb      Base data for generating API test database
```
PHP Unit Tests
==============
To execute the test suite run
    cd test/php
    phpunit ../
It will read phpunit.xml, which points to the library, the test path and the
bootstrap script and sets other parameters.
The tests can be configured with a set of environment variables:
* `BUILDDIR` - build directory of Nominatim installation to test
* `TEMPLATE_DB` - name of template database used as a skeleton for
the test databases (db tests)
* `TEST_DB` - name of test database (db tests)
* `API_TEST_DB` - name of the database containing the API test data (api tests)
* `DB_HOST` - (optional) hostname of database host
* `DB_USER` - (optional) username of database login
* `DB_PASS` - (optional) password for database login
* `SERVER_MODULE_PATH` - (optional) path on the Postgres server to the Nominatim
  module shared library file
* `TEST_SETTINGS_TEMPLATE` - file to write temporary Nominatim settings to
* `REMOVE_TEMPLATE` - if true, the template database will not be reused during
  the next run. Reusing the base templates speeds up tests.
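For example, a db test run against a local build might be set up like this. The paths and database names below are placeholders for your own setup, not defaults; the final command line is echoed rather than executed:

```shell
# Placeholder values -- substitute your own build directory and
# database name before running.
export BUILDDIR=/tmp/nominatim-build   # Nominatim build directory to test
export TEST_DB=test_nominatim          # scratch database for the db tests
export REMOVE_TEMPLATE=yes             # rebuild the template on the next run

# With the variables exported, the bdd tests are started from the
# test directory; shown here as an echo instead of a real run:
echo "BUILDDIR=$BUILDDIR behave bdd/db"
```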
### API Tests (`test/bdd/api`)

These tests are meant to test the different API endpoints and their parameters.
They require a preimported test database, which consists of the import of a
planet extract. A precompiled PBF with the necessary data can be downloaded from
https://www.nominatim.org/data/test/nominatim-api-testdata.pbf

The polygons defining the extract can be found in the test/testdb
directory. There is also a reduced set of wikipedia data for this extract,
which you need to import as well. For Tiger tests the data of South Dakota
is required. Get the Tiger files `46*`.
The official test dataset is derived from the 180924 planet. Newer
planets are likely to work as well but you may see isolated test
failures where the data has changed. To recreate the input data
for the test database run:
    wget https://ftp5.gwdg.de/pub/misc/openstreetmap/planet.openstreetmap.org/pbf/planet-180924.osm.pbf
    osmconvert planet-180924.osm.pbf -B=test/testdb/testdb.polys -o=testdb.pbf
Before importing make sure to add the following to your local settings:
    @define('CONST_Database_DSN', 'pgsql://@/test_api_nominatim');
    @define('CONST_Wikipedia_Data_Path', CONST_BasePath.'/test/testdb');
#### Code Coverage

The API tests also support code coverage tests. You need to install
[PHP_CodeCoverage](https://github.com/sebastianbergmann/php-code-coverage).
On Debian/Ubuntu run:

    apt-get install php-codecoverage php-xdebug

Then run the API tests as follows:

    behave api -DPHPCOV=<coverage output dir>

The output directory must be an absolute path. To generate reports, you can use
the [phpcov](https://github.com/sebastianbergmann/phpcov) tool:

    phpcov merge --html=<report output dir> <coverage output dir>
### Indexing Tests (`test/bdd/db`)
These tests check the import and update of the Nominatim database. They do not