Changed release version to be 0.4.0 (#271)
* wip

* wip

* corrected version information

* wip

* wip

* updated build instructions
ronanstokes-db committed Jun 7, 2024
1 parent 4206b5c commit aae8bde
Showing 9 changed files with 17 additions and 11 deletions.
3 changes: 2 additions & 1 deletion CHANGELOG.md
@@ -3,9 +3,10 @@
## Change History
All notable changes to the Databricks Labs Data Generator will be documented in this file.

### Unreleased
### Version 0.4.0

#### Changed
* Updated the minimum pyspark version to 3.2.1, compatible with Databricks runtime 10.4 LTS or later
* Modified data generator to allow specification of constraints to the data generation process
* Updated documentation for generating text data.
* Modified data distributions to use abstract base classes
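The constraints entry above is the main new capability in 0.4.0. Below is a minimal sketch of how constrained generation might be expressed: the `DataGenerator` / `withColumn` / `build` calls are the library's long-standing core API, but the `SqlExpr` class, its import path, and the `withConstraint` method are assumed names inferred from the changelog wording and are not confirmed by this diff.

```python
# Hypothetical sketch of the new 0.4.0 constraints capability.
# NOTE: SqlExpr and withConstraint are assumed names, not confirmed by this commit.
from pyspark.sql import SparkSession

import dbldatagen as dg
from dbldatagen.constraints import SqlExpr  # assumed import path

spark = SparkSession.builder.appName("constraints-sketch").getOrCreate()

df = (dg.DataGenerator(spark, name="orders", rows=1000, partitions=4)
      .withColumn("order_id", "long", minValue=1, maxValue=1000)
      .withColumn("amount", "double", minValue=0.0, maxValue=500.0)
      .withConstraint(SqlExpr("amount > 10.0"))  # assumed constraint API
      .build())

df.show(5)
```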
9 changes: 7 additions & 2 deletions CONTRIBUTING.md
@@ -26,7 +26,7 @@ runtime 9.1 LTS or later.

## Checking your code for common issues

Run `./lint.sh` from the project root directory to run various code style checks.
Run `make dev-lint` from the project root directory to run various code style checks.
These are based on the use of `prospector`, `pylint` and related tools.

## Setting up your build environment
@@ -45,6 +45,11 @@ Our recommended mechanism for building the code is to use a `conda` or `pipenv`

But it can be built with any Python virtual environment.

### Spark dependencies
The builds have been tested against Spark 3.2.1. This requires OpenJDK 1.8.56 or a later version of Java 8.
The Databricks runtimes use the Azul Zulu build of OpenJDK 8, which we have also used in local testing.
The Java runtime is not installed automatically by the build process, so you will need to install it separately.
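As a quick way to confirm that a local environment matches these requirements, something like the following can be run before building; this is a sketch that only assumes a working local `pyspark` installation (the JVM version is read through the Py4J gateway's private `_jvm` handle):

```python
# Environment sanity check: report the PySpark and JVM versions in use.
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[1]").appName("env-check").getOrCreate()
print("Spark version:", spark.version)  # expect 3.2.1 or later
# Java 8 runtimes report a version string of the form "1.8.0_xxx"
print("Java version :", spark.sparkContext._jvm.java.lang.System.getProperty("java.version"))
spark.stop()
```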

### Building with Conda
To build with `conda`, perform the following commands:
- `make create-dev-env` from the main project directory to create your conda environment, if using
@@ -70,7 +75,7 @@ To build with `pipenv`, perform the following commands:
- Run `make dist` from the main project directory
- The resulting wheel file will be placed in the `dist` subdirectory

The resulting build has been tested against Spark 3.0.1
The resulting build has been tested against Spark 3.2.1
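After installing the wheel produced by `make dist` into the active environment, the version recorded in `dbldatagen/_version.py` (changed later in this commit) gives a quick post-build check. A sketch, assuming the wheel has already been pip-installed:

```python
# Post-build smoke check: the installed package should report the new release version.
from dbldatagen._version import __version__, __version_info__

print(__version__)       # expected: "0.4.0" for this release
print(__version_info__)  # parsed form produced by get_version()
```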

## Creating the HTML documentation

2 changes: 1 addition & 1 deletion README.md
@@ -65,7 +65,7 @@ details of use and many examples.

Release notes and details of the latest changes for this specific release
can be found in the GitHub repository
[here](https://github.com/databrickslabs/dbldatagen/blob/release/v0.3.6post1/CHANGELOG.md)
[here](https://github.com/databrickslabs/dbldatagen/blob/release/v0.4.0/CHANGELOG.md)

# Installation

2 changes: 1 addition & 1 deletion dbldatagen/_version.py
@@ -34,7 +34,7 @@ def get_version(version):
return version_info


__version__ = "0.3.6post1" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
__version__ = "0.4.0" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
__version_info__ = get_version(__version__)


2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -32,7 +32,7 @@
author = 'Databricks Inc'

# The full version, including alpha/beta/rc tags
release = "0.3.6post1" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
release = "0.4.0" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion

# -- General configuration ---------------------------------------------------

2 changes: 1 addition & 1 deletion python/.bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
current_version = 0.3.6post1
current_version = 0.4.0
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+){0,1}(?P<release>\D*)(?P<build>\d*)
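The `parse` pattern above is what lets `bumpversion` handle both plain releases and post-release suffixes. A small standalone check, using the pattern exactly as it appears in the config, shows how the old and new version strings from this commit are split:

```python
# Demonstrate the bumpversion parse pattern against the old and new version strings.
import re

PARSE = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+){0,1}(?P<release>\D*)(?P<build>\d*)"

for version in ("0.3.6post1", "0.4.0"):
    parts = re.match(PARSE, version).groupdict()
    print(version, "->", parts)
# 0.3.6post1 -> {'major': '0', 'minor': '3', 'patch': '6', 'release': 'post', 'build': '1'}
# 0.4.0      -> {'major': '0', 'minor': '4', 'patch': '0', 'release': '', 'build': ''}
```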
4 changes: 2 additions & 2 deletions python/dev_require.txt
@@ -3,9 +3,9 @@
numpy==1.22.0
pandas==1.2.4
pickleshare==0.7.5
py4j==0.10.9
py4j>=0.10.9.3
pyarrow==4.0.1
pyspark>=3.1.3
pyspark>=3.2.1,<=3.3.0
python-dateutil==2.8.1
six==1.15.0
pyparsing==2.4.7
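The dev requirements now pin `pyspark` to a bounded range rather than an open-ended minimum. A sketch of how to confirm that an existing environment satisfies that range, assuming the third-party `packaging` library is available:

```python
# Check the installed pyspark against the range declared in dev_require.txt.
from importlib.metadata import version
from packaging.specifiers import SpecifierSet

installed = version("pyspark")
required = SpecifierSet(">=3.2.1,<=3.3.0")
print(installed, "satisfies", required, ":", installed in required)
```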
2 changes: 1 addition & 1 deletion python/require.txt
@@ -5,7 +5,7 @@ pandas==1.2.5
pickleshare==0.7.5
py4j==0.10.9
pyarrow==4.0.1
pyspark>=3.1.3
pyspark>=3.2.1
python-dateutil==2.8.1
six==1.15.0
pyparsing==2.4.7
2 changes: 1 addition & 1 deletion setup.py
@@ -31,7 +31,7 @@

setuptools.setup(
name="dbldatagen",
version="0.3.6post1",
version="0.4.0",
author="Ronan Stokes, Databricks",
description="Databricks Labs - PySpark Synthetic Data Generator",
long_description=long_description,
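Once a build of this release is installed, the metadata declared in `setup.py` above can be read back as a final cross-check against the bumped version. A sketch assuming Python 3.8+ (for `importlib.metadata`):

```python
# Read back the installed distribution metadata declared in setup.py.
from importlib.metadata import metadata

info = metadata("dbldatagen")
print(info["Version"])  # expected: 0.4.0
print(info["Summary"])  # "Databricks Labs - PySpark Synthetic Data Generator"
```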
