1. [Create a pyenv](Pyenv%20usage.md) with the right Python version, i.e. the version shown in your Databricks cluster's Web Terminal (found under Cluster -> Apps).
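As a minimal sketch, assuming the Web Terminal reported Python 3.9.5 (a hypothetical version; substitute whatever your cluster actually shows):

```shell
# Hypothetical version; replace with the one your cluster's Web Terminal reports.
PY_VERSION=3.9.5
if command -v pyenv >/dev/null 2>&1; then
  pyenv install -s "$PY_VERSION"   # -s skips the build if already installed
  pyenv local "$PY_VERSION"        # pin this version for the project directory
fi
```

The `pyenv local` call drops a `.python-version` file in the project root, so the pinned interpreter is picked up automatically when you cd into the directory.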
2. Create a new PyCharm project with a clean virtualenv using the Python version just established.
3. Install the core Databricks packages:
```shell
pip install databricks-cli dbx pyspark
```
4. Initialize the git repository: `git init`
5. Configure dbx:
```shell
dbx configure --profile DEFAULT --environment default
```
6. Create the deployment configuration (paste the content from the next step, then end input with Ctrl-D):
```shell
mkdir -p conf
cat > conf/deployment.yaml
```
7. Content of `conf/deployment.yaml`:
```yaml
build:
  no_build: true
environments:
  default:
    workflows:
      - name: "dbx-demo-job"
        spark_python_task:
          python_file: "file://my-demo-job.py"
```
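The deployment above references `my-demo-job.py` in the project root. A minimal sketch of that file, written via a heredoc; the Spark code inside is a hypothetical placeholder, not content prescribed by dbx:

```shell
# Write a minimal demo job; the Spark code is a hypothetical placeholder.
cat > my-demo-job.py <<'EOF'
from pyspark.sql import SparkSession

# Reuse the session Databricks provides, or build a local one when run standalone.
spark = SparkSession.builder.getOrCreate()
spark.range(10).show()
EOF
```

With the file in place you can push the workflow with `dbx deploy dbx-demo-job` and trigger a run with `dbx launch dbx-demo-job` (exact flags depend on your dbx version).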