✨(project) first proof of concept printing pdf from markdown
This is a boilerplate inspired by https://github.com/openfun/joanie
33 .dockerignore Normal file
@@ -0,0 +1,33 @@
# Python
__pycache__
*.pyc
**/__pycache__
**/*.pyc
venv
.venv

# System-specific files
.DS_Store
**/.DS_Store

# Docker
docker compose.*
env.d

# Docs
docs
*.md
*.log

# Development/test cache & configurations
data
.cache
.circleci
.git
.vscode
.iml
.idea
db.sqlite3
.mypy_cache
.pylint.d
.pytest_cache
6 .github/ISSUE_TEMPLATE.md vendored Normal file
@@ -0,0 +1,6 @@
<!---
Thanks for filing an issue 😄 ! Before you submit, please read the following:

Check the other issue templates if you are trying to submit a bug report, feature request, or question

Search open/closed issues before submitting since someone might have asked the same thing before!
-->
28 .github/ISSUE_TEMPLATE/Bug_report.md vendored Normal file
@@ -0,0 +1,28 @@
---
name: 🐛 Bug Report
about: If something is not working as expected 🤔.

---

## Bug Report

**Problematic behavior**
A clear and concise description of the behavior.

**Expected behavior/code**
A clear and concise description of what you expected to happen (or code).

**Steps to Reproduce**
1. Do this...
2. Then this...
3. And then the bug happens!

**Environment**
- publish version:
- Platform:

**Possible Solution**
<!--- Only if you have suggestions on a fix for the bug -->

**Additional context/Screenshots**
Add any other context about the problem here. If applicable, add screenshots to help explain.
23 .github/ISSUE_TEMPLATE/Feature_request.md vendored Normal file
@@ -0,0 +1,23 @@
---
name: ✨ Feature Request
about: I have a suggestion (and may want to build it 💪)!

---

## Feature Request

**Is your feature request related to a problem or unsupported use case? Please describe.**
A clear and concise description of what the problem is. For example: I need to do some task and I have an issue...

**Describe the solution you'd like**
A clear and concise description of what you want to happen. Add any considered drawbacks.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Discovery, Documentation, Adoption, Migration Strategy**
If you can, explain how users will be able to use this and possibly write out a version of the docs (if applicable).
Maybe a screenshot or design?

**Do you want to work on it through a Pull Request?**
<!-- Make sure to coordinate with us before you spend too much time working on an implementation! -->
22 .github/ISSUE_TEMPLATE/Support_question.md vendored Normal file
@@ -0,0 +1,22 @@
---
name: 🤗 Support Question
about: If you have a question 💬, or something was not clear from the docs!

---

<!-- ^ Click "Preview" for a nicer view! ^
We primarily use GitHub as an issue tracker. If however you're encountering an issue not covered in the docs, we may be able to help! -->

---

Please make sure you have read our [main Readme](https://github.com/numerique-gouv/publish).

Also make sure it was not already answered in [an open or closed issue](https://github.com/numerique-gouv/publish/issues).

If your question was not covered, and you feel like it should be, fire away! We'd love to improve our docs! 👌

**Topic**
What's the general area of your question: for example, docker setup, database schema, search functionality,...

**Question**
Try to be as specific as possible so we can help you as best we can. Please be patient 🙏
11 .github/PULL_REQUEST_TEMPLATE.md vendored Normal file
@@ -0,0 +1,11 @@
## Purpose

Description...

## Proposal

Description...

- [ ] item 1...
- [ ] item 2...
226 .github/workflows/publish.yml vendored Normal file
@@ -0,0 +1,226 @@
name: publish Workflow

on:
  push:
    branches:
      - main
    tags:
      - 'v*'
  pull_request:
    branches:
      - '*'

jobs:
  lint-git:
    runs-on: ubuntu-latest
    if: github.event_name == 'pull_request' # Makes sense only for pull requests
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: show
        run: git log
      - name: Enforce absence of print statements in code
        run: |
          ! git diff origin/${{ github.event.pull_request.base.ref }}..HEAD -- . ':(exclude)**/publish.yml' | grep "print("
      - name: Check absence of fixup commits
        run: |
          ! git log | grep 'fixup!'
      - name: Install gitlint
        run: pip install --user requests gitlint
      - name: Lint commit messages added to main
        run: ~/.local/bin/gitlint --commits origin/${{ github.event.pull_request.base.ref }}..HEAD

  check-changelog:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Check that the CHANGELOG has been modified in the current branch
        run: git whatchanged --name-only --pretty="" origin..HEAD | grep CHANGELOG

  lint-changelog:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Check CHANGELOG max line length
        run: |
          max_line_length=$(cat CHANGELOG.md | grep -Ev "^\[.*\]: https://github.com" | wc -L)
          if [ $max_line_length -ge 80 ]; then
            echo "ERROR: CHANGELOG has lines longer than 80 characters."
            exit 1
          fi

  build-mails:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: src/mail
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Install Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
      - name: Install yarn
        run: npm install -g yarn
      - name: Install node dependencies
        run: yarn install --frozen-lockfile
      - name: Build mails
        run: yarn build

  build-docker:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Generate a version.json file describing app release
        run: |
          printf '{"commit":"${{ github.sha }}","version":"${{ github.ref }}","source":"https://github.com/${{ github.repository_owner }}/${{ github.repository }}","build":"${{ github.run_id }}"}\n' > src/backend/publish/version.json
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Build production image
        run: docker build -t publish:${{ github.sha }} --target production .
      - name: Check built image availability
        run: docker images "publish:${{ github.sha }}*"

  lint-back:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: src/backend
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'
      - name: Install development dependencies
        run: pip install --user .[dev]
      - name: Check code formatting with ruff
        run: ~/.local/bin/ruff format publish --diff
      - name: Lint code with ruff
        run: ~/.local/bin/ruff check publish
      - name: Lint code with pylint
        run: ~/.local/bin/pylint publish

  test-back:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: src/backend

    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_DB: publish
          POSTGRES_USER: dinum
          POSTGRES_PASSWORD: pass
        ports:
          - 5432:5432
        # needed because the postgres container does not provide a healthcheck
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    env:
      DJANGO_CONFIGURATION: Test
      DJANGO_SETTINGS_MODULE: publish.settings
      DJANGO_SECRET_KEY: ThisIsAnExampleKeyForTestPurposeOnly
      DJANGO_JWT_PRIVATE_SIGNING_KEY: ThisIsAnExampleKeyForDevPurposeOnly
      DB_HOST: localhost
      DB_NAME: publish
      DB_USER: dinum
      DB_PASSWORD: pass
      DB_PORT: 5432

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Create writable /data
        run: |
          sudo mkdir -p /data/media && \
          sudo mkdir -p /data/static
      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'
      - name: Install development dependencies
        run: pip install --user .[dev]
      - name: Install gettext (required to compile messages)
        run: |
          sudo apt-get update
          sudo apt-get install -y gettext
      - name: Generate a MO file from strings extracted from the project
        run: python manage.py compilemessages
      - name: Run tests
        run: ~/.local/bin/pytest -n 2

  i18n-back:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Install gettext (required to make messages)
        run: |
          sudo apt-get update
          sudo apt-get install -y gettext
      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.10'
      - name: Install development dependencies
        working-directory: src/backend
        run: pip install --user .[dev]
      - name: Generate the translation base file
        run: ~/.local/bin/django-admin makemessages --keep-pot --all
      - name: Upload files to Crowdin
        run: |
          docker run \
          --rm \
          -e CROWDIN_API_TOKEN=${{ secrets.CROWDIN_API_TOKEN }} \
          -e CROWDIN_PROJECT_ID=${{ vars.CROWDIN_PROJECT_ID }} \
          -e CROWDIN_BASE_PATH=${{ vars.CROWDIN_BASE_PATH }} \
          -v "${{ github.workspace }}:/app" \
          crowdin/cli:3.16.0 \
          crowdin upload sources -c /app/crowdin/config.yml

  hub:
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/v')
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Generate a version.json file describing app release
        run: |
          printf '{"commit":"${{ github.sha }}","version":"${{ github.ref }}","source":"https://github.com/${{ github.repository_owner }}/${{ github.repository }}","build":"${{ github.run_id }}"}\n' > src/backend/publish/version.json
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1
      - name: Build production image
        run: docker build -t publish:${{ github.sha }} --target production .
      - name: Check built images availability
        run: docker images "publish:${{ github.sha }}*"
      - name: Login to DockerHub
        run: echo "${{ secrets.DOCKER_HUB_PASSWORD }}" | docker login -u "${{ secrets.DOCKER_HUB_USER }}" --password-stdin
      - name: Tag images
        run: |
          DOCKER_TAG=$([[ -z "${{ github.event.ref }}" ]] && echo "${{ github.event.ref }}" || echo "${{ github.event.ref }}" | sed 's/^v//')
          RELEASE_TYPE=$([[ -z "${{ github.event.ref }}" ]] && echo "branch" || echo "tag ")
          echo "DOCKER_TAG: ${DOCKER_TAG} (Git ${RELEASE_TYPE}${{ github.event.ref }})"
          docker tag publish:${{ github.sha }} numerique-gouv/publish:${DOCKER_TAG}
          if [[ -n "${{ github.event.ref }}" ]]; then
            docker tag publish:${{ github.sha }} numerique-gouv/publish:latest
          fi
          docker images | grep -E "^numerique-gouv/publish\s*(${DOCKER_TAG}.*|latest|main)"
      - name: Publish images
        run: |
          DOCKER_TAG=$([[ -z "${{ github.event.ref }}" ]] && echo "${{ github.event.ref }}" || echo "${{ github.event.ref }}" | sed 's/^v//')
          RELEASE_TYPE=$([[ -z "${{ github.event.ref }}" ]] && echo "branch" || echo "tag ")
          echo "DOCKER_TAG: ${DOCKER_TAG} (Git ${RELEASE_TYPE}${{ github.event.ref }})"
          docker push numerique-gouv/publish:${DOCKER_TAG}
          if [[ -n "${{ github.event.ref }}" ]]; then
            docker push numerique-gouv/publish:latest
          fi
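The `lint-changelog` job's length check can be tried outside CI. A minimal sketch, assuming GNU `wc -L` (Linux), and using a throwaway file in place of the repository's CHANGELOG.md:

```shell
# Build a tiny stand-in changelog: a short heading plus a long
# link-reference line of the kind the CI check deliberately excludes.
changelog="$(mktemp)"
printf '%s\n' '# Changelog' \
  '[unreleased]: https://github.com/numerique-gouv/publish/compare/main...HEAD' \
  > "$changelog"

# Drop link-reference lines (they legitimately exceed 80 characters),
# then report the length of the longest remaining line.
max_line_length=$(grep -Ev "^\[.*\]: https://github.com" "$changelog" | wc -L)
if [ "$max_line_length" -ge 80 ]; then
  echo "ERROR: CHANGELOG has lines longer than 80 characters."
else
  echo "OK: longest line is ${max_line_length} chars"
fi
rm -f "$changelog"
```

With the stand-in file above, only `# Changelog` (11 characters) survives the exclusion, so the check passes.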
80 .gitignore vendored Normal file
@@ -0,0 +1,80 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
.DS_Store
.next/

# Translations
*.pot

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
env.d/development/*
!env.d/development/*.dist
env.d/terraform

# npm
node_modules

# Mails
src/backend/core/templates/mail/

# Typescript client
src/frontend/tsclient

# Swagger
**/swagger.json

# Logs
*.log

# Terraform
.terraform
*.tfstate
*.tfstate.backup

# Test & lint
.coverage
.pylint.d
.pytest_cache
db.sqlite3
.mypy_cache

# Site media
/data/

# IDEs
.idea/
.vscode/
*.iml
78 .gitlint Normal file
@@ -0,0 +1,78 @@
# All these sections are optional, edit this file as you like.
[general]
# Ignore certain rules, you can reference them by their id or by their full name
# ignore=title-trailing-punctuation, T3

# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
# verbosity = 2

# By default gitlint will ignore merge commits. Set to 'false' to disable.
# ignore-merge-commits=true

# By default gitlint will ignore fixup commits. Set to 'false' to disable.
# ignore-fixup-commits=true

# By default gitlint will ignore squash commits. Set to 'false' to disable.
# ignore-squash-commits=true

# Enable debug mode (prints more output). Disabled by default.
# debug=true

# Set the extra-path where gitlint will search for user defined rules
# See http://jorisroovers.github.io/gitlint/user_defined_rules for details
extra-path=gitlint/

# [title-max-length]
# line-length=80

[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
# insensitive. It's fine if the keyword occurs as part of a larger word (so "WIPING"
# will not cause a violation, but "WIP: my title" will).
words=wip

#[title-match-regex]
# python like regex (https://docs.python.org/2/library/re.html) that the
# commit-msg title must be matched to.
# Note that the regex can contradict with other rules if not used correctly
# (e.g. title-must-not-contain-word).
#regex=

# [B1]
# B1 = body-max-line-length
# line-length=120
# [body-min-length]
# min-length=5

# [body-is-missing]
# Whether to ignore this rule on merge commits (which typically only have a title)
# default = True
# ignore-merge-commits=false

# [body-changed-file-mention]
# List of files that need to be explicitly mentioned in the body when they are changed
# This is useful for when developers often erroneously edit certain files or git submodules.
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
# files=gitlint/rules.py,README.md

# [author-valid-email]
# python like regex (https://docs.python.org/2/library/re.html) that the
# commit author email address should be matched to
# For example, use the following regex if you only want to allow email addresses from foo.com
# regex=[^@]+@foo.com

[ignore-by-title]
# Allow empty body & wrong title pattern only when bots (pyup/greenkeeper)
# upgrade dependencies
regex=^(⬆️.*|Update (.*) from (.*) to (.*)|(chore|fix)\(package\): update .*)$
ignore=B6,UC1

# [ignore-by-body]
# Ignore certain rules for commits of which the body has a line that matches a regex
# E.g. Match bodies that have a line that contains "release"
# regex=(.*)release(.*)
#
# Ignore certain rules, you can reference them by their id or by their full name
# Use 'all' to ignore all rules
# ignore=T1,body-min-length
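The `[ignore-by-title]` regex above can be sanity-checked from the command line. A quick sketch using `grep -E` as an approximation of gitlint's Python `re` matching (close enough for these patterns):

```shell
# Same pattern as the .gitlint [ignore-by-title] section.
regex='^(⬆️.*|Update (.*) from (.*) to (.*)|(chore|fix)\(package\): update .*)$'

# Dependency-bump titles match and would skip the ignored rules...
echo 'Update django from 4.2 to 5.0' | grep -qE "$regex" && echo 'bot title: matched'
echo 'chore(package): update ruff'   | grep -qE "$regex" && echo 'bot title: matched'

# ...while an ordinary commit title does not match and is linted normally.
echo '✨(project) add pdf export'    | grep -qE "$regex" || echo 'normal title: not matched'
```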
9 CHANGELOG.md Normal file
@@ -0,0 +1,9 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0),
and this project adheres to
[Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]
139 Dockerfile Normal file
@@ -0,0 +1,139 @@
# Django publish

# ---- base image to inherit from ----
FROM python:3.10-slim-bookworm as base

# Upgrade pip to its latest release to speed up dependencies installation
RUN python -m pip install --upgrade pip

# Upgrade system packages to install security updates
# python3-pip python3-cffi python3-brotli \
RUN apt-get update && \
    apt-get -y upgrade && \
    apt-get -y install \
    gettext \
    libpango-1.0-0 libpangoft2-1.0-0 pango1.0-tools && \
    rm -rf /var/lib/apt/lists/*

# ---- Back-end builder image ----
FROM base as back-builder

WORKDIR /builder

# Copy required python dependencies
COPY ./src/backend /builder

RUN mkdir /install && \
    pip install --prefix=/install .

# ---- mails ----
FROM node:18 as mail-builder

COPY ./src/mail /mail/app

WORKDIR /mail/app

RUN yarn install --frozen-lockfile && \
    yarn build

# ---- static link collector ----
FROM base as link-collector
ARG publish_STATIC_ROOT=/data/static

# Install rdfind
RUN apt-get update && \
    apt-get install -y \
    rdfind && \
    rm -rf /var/lib/apt/lists/*

# Copy installed python dependencies
COPY --from=back-builder /install /usr/local

# Copy publish application (see .dockerignore)
COPY ./src/backend /app/

WORKDIR /app

# collectstatic
RUN DJANGO_CONFIGURATION=Build DJANGO_JWT_PRIVATE_SIGNING_KEY=Dummy \
    python manage.py collectstatic --noinput

# Replace duplicated files by symlinks to decrease the overall size of the
# final image
RUN rdfind -makesymlinks true -followsymlinks true -makeresultsfile false ${publish_STATIC_ROOT}

# ---- Core application image ----
FROM base as core

ENV PYTHONUNBUFFERED=1

# Copy entrypoint
COPY ./docker/files/usr/local/bin/entrypoint /usr/local/bin/entrypoint

# Give the "root" group the same permissions as the "root" user on /etc/passwd
# to allow a user belonging to the root group to add new users; typically the
# docker user (see entrypoint).
RUN chmod g=u /etc/passwd

# Copy installed python dependencies
COPY --from=back-builder /install /usr/local

# Copy publish application (see .dockerignore)
COPY ./src/backend /app/

WORKDIR /app

# We wrap commands run in this container by the following entrypoint that
# creates a user on-the-fly with the container user ID (see USER) and root group
# ID.
ENTRYPOINT [ "/usr/local/bin/entrypoint" ]

# ---- Development image ----
FROM core as development

# Switch back to the root user to install development dependencies
USER root:root

# Install psql
RUN apt-get update && \
    apt-get install -y postgresql-client && \
    rm -rf /var/lib/apt/lists/*

# Uninstall publish and re-install it in editable mode along with development
# dependencies
RUN pip uninstall -y publish
RUN pip install -e .[dev]

# Restore the un-privileged user running the application
ARG DOCKER_USER
USER ${DOCKER_USER}

# Target database host (e.g. database engine following docker compose services
# name) & port
ENV DB_HOST=postgresql \
    DB_PORT=5432

# Run django development server
CMD python manage.py runserver 0.0.0.0:8000

# ---- Production image ----
FROM core as production

ARG publish_STATIC_ROOT=/data/static

# Gunicorn
RUN mkdir -p /usr/local/etc/gunicorn
COPY docker/files/usr/local/etc/gunicorn/publish.py /usr/local/etc/gunicorn/publish.py

# Un-privileged user running the application
ARG DOCKER_USER
USER ${DOCKER_USER}

# Copy statics
COPY --from=link-collector ${publish_STATIC_ROOT} ${publish_STATIC_ROOT}

# Copy publish mails
COPY --from=mail-builder /mail/backend/core/templates/mail /app/core/templates/mail

# The default command runs gunicorn WSGI server in publish's main module
CMD gunicorn -c /usr/local/etc/gunicorn/publish.py publish.wsgi:application
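The entrypoint script itself is not part of this commit, but the Dockerfile comments describe what it does: `chmod g=u /etc/passwd` lets a member of the root group append a passwd entry for whatever arbitrary UID the container was started with. A hypothetical sketch of that on-the-fly user creation, run here against a temp file standing in for `/etc/passwd`:

```shell
# Stand-in for /etc/passwd (the real entrypoint would edit /etc/passwd,
# writable thanks to `chmod g=u` in the image).
passwd_file="$(mktemp)"
echo 'root:x:0:0:root:/root:/bin/bash' > "$passwd_file"

# Example arbitrary UID, as injected via `docker run -u "$(id -u):0"`.
uid=4242

# If no passwd entry exists for this UID, create one so tools that look up
# the current user (whoami, home-directory resolution, ...) keep working.
if ! cut -d: -f3 "$passwd_file" | grep -qx "$uid"; then
  echo "default:x:${uid}:0:default user:/home:/bin/sh" >> "$passwd_file"
fi
grep ":${uid}:" "$passwd_file"
```

This is a sketch of the pattern, not the project's actual entrypoint; the entry fields (name, home, shell) are illustrative.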
2 LICENSE
@@ -1,6 +1,6 @@
 MIT License

-Copyright (c) 2023 Direction Interministérielle du Numérique du Gouvernement Français
+Copyright (c) 2023 Direction Interministérielle du Numérique - Gouvernement Français

 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
278 Makefile Normal file
@@ -0,0 +1,278 @@
# /!\ /!\ /!\ /!\ /!\ /!\ /!\ DISCLAIMER /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\
#
# This Makefile is only meant to be used for DEVELOPMENT purpose as we are
# changing the user id that will run in the container.
#
# PLEASE DO NOT USE IT FOR YOUR CI/PRODUCTION/WHATEVER...
#
# /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\
#
# Note to developers:
#
# While editing this file, please respect the following statements:
#
# 1. Every variable should be defined in the ad hoc VARIABLES section with a
#    relevant subsection
# 2. Every new rule should be defined in the ad hoc RULES section with a
#    relevant subsection depending on the targeted service
# 3. Rules should be sorted alphabetically within their section
# 4. When a rule has multiple dependencies, you should:
#    - duplicate the rule name to add the help string (if required)
#    - write one dependency per line to increase readability and diffs
# 5. .PHONY rule statement should be written after the corresponding rule
# ==============================================================================
# VARIABLES

BOLD := \033[1m
RESET := \033[0m
GREEN := \033[1;32m


# -- Database

DB_HOST = postgresql
DB_PORT = 5432

# -- Docker
# Get the current user ID to use for docker run and docker exec commands
DOCKER_UID          = $(shell id -u)
DOCKER_GID          = $(shell id -g)
DOCKER_USER         = $(DOCKER_UID):$(DOCKER_GID)
COMPOSE             = DOCKER_USER=$(DOCKER_USER) docker compose
COMPOSE_EXEC        = $(COMPOSE) exec
COMPOSE_EXEC_APP    = $(COMPOSE_EXEC) app-dev
COMPOSE_RUN         = $(COMPOSE) run --rm
COMPOSE_RUN_APP     = $(COMPOSE_RUN) app-dev
COMPOSE_RUN_CROWDIN = $(COMPOSE_RUN) crowdin crowdin
WAIT_DB             = @$(COMPOSE_RUN) dockerize -wait tcp://$(DB_HOST):$(DB_PORT) -timeout 60s

# -- Backend
MANAGE        = $(COMPOSE_RUN_APP) python manage.py
MAIL_YARN     = $(COMPOSE_RUN) -w /app/src/mail node yarn
TSCLIENT_YARN = $(COMPOSE_RUN) -w /app/src/tsclient node yarn

# ==============================================================================
# RULES

default: help

data/media:
	@mkdir -p data/media

data/static:
	@mkdir -p data/static

# -- Project

bootstrap: ## Prepare Docker images for the project
bootstrap: \
	data/media \
	data/static \
	env.d/development/common \
	env.d/development/crowdin \
	env.d/development/postgresql \
	build \
	run \
	migrate \
	i18n-compile \
	mails-install \
	mails-build
.PHONY: bootstrap

# -- Docker/compose
build: ## build the app-dev container
	@$(COMPOSE) build app-dev --no-cache
.PHONY: build

down: ## stop and remove containers, networks, images, and volumes
	@$(COMPOSE) down
.PHONY: down

logs: ## display app-dev logs (follow mode)
	@$(COMPOSE) logs -f app-dev
.PHONY: logs

run: ## start the wsgi (production) and development server
	@$(COMPOSE) up --force-recreate -d nginx
	@$(COMPOSE) up --force-recreate -d app-dev
	@$(COMPOSE) up --force-recreate -d celery-dev
	@echo "Wait for postgresql to be up..."
	@$(WAIT_DB)
.PHONY: run

status: ## an alias for "docker compose ps"
	@$(COMPOSE) ps
.PHONY: status

stop: ## stop the development server using Docker
	@$(COMPOSE) stop
.PHONY: stop

# -- Backend

demo: ## flush db then create a demo for load testing purpose
	@$(MAKE) resetdb
	@$(MANAGE) create_demo
.PHONY: demo

# Nota bene: Black should come after isort just in case they don't agree...
lint: ## lint back-end python sources
lint: \
	lint-ruff-format \
	lint-ruff-check \
	lint-pylint
.PHONY: lint

lint-ruff-format: ## format back-end python sources with ruff
	@echo 'lint:ruff-format started…'
	@$(COMPOSE_RUN_APP) ruff format .
.PHONY: lint-ruff-format

lint-ruff-check: ## lint back-end python sources with ruff
	@echo 'lint:ruff-check started…'
	@$(COMPOSE_RUN_APP) ruff check . --fix
.PHONY: lint-ruff-check

lint-pylint: ## lint back-end python sources with pylint only on changed files from main
	@echo 'lint:pylint started…'
	bin/pylint --diff-only=origin/main
.PHONY: lint-pylint

test: ## run project tests
	@$(MAKE) test-back-parallel
.PHONY: test

test-back: ## run back-end tests
	@args="$(filter-out $@,$(MAKECMDGOALS))" && \
	bin/pytest $${args:-${1}}
.PHONY: test-back

test-back-parallel: ## run all back-end tests in parallel
	@args="$(filter-out $@,$(MAKECMDGOALS))" && \
	bin/pytest -n auto $${args:-${1}}
.PHONY: test-back-parallel


makemigrations: ## run django makemigrations for the publish project.
	@echo "$(BOLD)Running makemigrations$(RESET)"
	@$(COMPOSE) up -d postgresql
	@$(WAIT_DB)
	@$(MANAGE) makemigrations
.PHONY: makemigrations

migrate: ## run django migrations for the publish project.
	@echo "$(BOLD)Running migrations$(RESET)"
	@$(COMPOSE) up -d postgresql
	@$(WAIT_DB)
	@$(MANAGE) migrate
.PHONY: migrate

superuser: ## Create an admin superuser with password "admin"
	@echo "$(BOLD)Creating a Django superuser$(RESET)"
	@$(MANAGE) createsuperuser --email admin@example.com --password admin
.PHONY: superuser

back-i18n-compile: ## compile the gettext files
	@$(MANAGE) compilemessages --ignore="venv/**/*"
.PHONY: back-i18n-compile

back-i18n-generate: ## create the .pot files used for i18n
	@$(MANAGE) makemessages -a --keep-pot
.PHONY: back-i18n-generate

shell: ## start a django shell
	@$(MANAGE) shell #_plus
.PHONY: shell

# -- Database

dbshell: ## connect to database shell
	docker compose exec app-dev python manage.py dbshell
.PHONY: dbshell
|
resetdb: ## flush database and create a superuser "admin"
|
||||||
|
@echo "$(BOLD)Flush database$(RESET)"
|
||||||
|
@$(MANAGE) flush
|
||||||
|
@${MAKE} superuser
|
||||||
|
.PHONY: resetdb
|
||||||
|
|
||||||
|
env.d/development/common:
|
||||||
|
cp -n env.d/development/common.dist env.d/development/common
|
||||||
|
|
||||||
|
env.d/development/postgresql:
|
||||||
|
cp -n env.d/development/postgresql.dist env.d/development/postgresql
|
||||||
|
|
||||||
|
# -- Internationalization
|
||||||
|
|
||||||
|
env.d/development/crowdin:
|
||||||
|
cp -n env.d/development/crowdin.dist env.d/development/crowdin
|
||||||
|
|
||||||
|
crowdin-download: ## Download translated message from crowdin
|
||||||
|
@$(COMPOSE_RUN_CROWDIN) download -c crowdin/config.yml
|
||||||
|
.PHONY: crowdin-download
|
||||||
|
|
||||||
|
crowdin-upload: ## Upload source translations to crowdin
|
||||||
|
@$(COMPOSE_RUN_CROWDIN) upload sources -c crowdin/config.yml
|
||||||
|
.PHONY: crowdin-upload
|
||||||
|
|
||||||
|
i18n-compile: ## compile all translations
|
||||||
|
i18n-compile: \
|
||||||
|
back-i18n-compile
|
||||||
|
.PHONY: i18n-compile
|
||||||
|
|
||||||
|
i18n-generate: ## create the .pot files and extract frontend messages
|
||||||
|
i18n-generate: \
|
||||||
|
back-i18n-generate
|
||||||
|
.PHONY: i18n-generate
|
||||||
|
|
||||||
|
i18n-download-and-compile: ## download all translated messages and compile them to be used by all applications
|
||||||
|
i18n-download-and-compile: \
|
||||||
|
crowdin-download \
|
||||||
|
i18n-compile
|
||||||
|
.PHONY: i18n-download-and-compile
|
||||||
|
|
||||||
|
i18n-generate-and-upload: ## generate source translations for all applications and upload them to crowdin
|
||||||
|
i18n-generate-and-upload: \
|
||||||
|
i18n-generate \
|
||||||
|
crowdin-upload
|
||||||
|
.PHONY: i18n-generate-and-upload
|
||||||
|
|
||||||
|
|
||||||
|
# -- Mail generator
|
||||||
|
|
||||||
|
mails-build: ## Convert mjml files to html and text
|
||||||
|
@$(MAIL_YARN) build
|
||||||
|
.PHONY: mails-build
|
||||||
|
|
||||||
|
mails-build-html-to-plain-text: ## Convert html files to text
|
||||||
|
@$(MAIL_YARN) build-html-to-plain-text
|
||||||
|
.PHONY: mails-build-html-to-plain-text
|
||||||
|
|
||||||
|
mails-build-mjml-to-html: ## Convert mjml files to html and text
|
||||||
|
@$(MAIL_YARN) build-mjml-to-html
|
||||||
|
.PHONY: mails-build-mjml-to-html
|
||||||
|
|
||||||
|
mails-install: ## install the mail generator
|
||||||
|
@$(MAIL_YARN) install
|
||||||
|
.PHONY: mails-install
|
||||||
|
|
||||||
|
# -- TS client generator
|
||||||
|
|
||||||
|
tsclient-install: ## Install the Typescipt API client generator
|
||||||
|
@$(TSCLIENT_YARN) install
|
||||||
|
.PHONY: tsclient-install
|
||||||
|
|
||||||
|
tsclient: tsclient-install ## Generate a Typescipt API client
|
||||||
|
@$(TSCLIENT_YARN) generate:api:client:local ../frontend/tsclient
|
||||||
|
.PHONY: tsclient-install
|
||||||
|
|
||||||
|
# -- Misc
|
||||||
|
clean: ## restore repository state as it was freshly cloned
|
||||||
|
git clean -idx
|
||||||
|
.PHONY: clean
|
||||||
|
|
||||||
|
help:
|
||||||
|
@echo "$(BOLD)publish Makefile"
|
||||||
|
@echo "Please use 'make $(BOLD)target$(RESET)' where $(BOLD)target$(RESET) is one of:"
|
||||||
|
@grep -E '^[a-zA-Z0-9_-]+:.*?## .*$$' $(firstword $(MAKEFILE_LIST)) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "$(GREEN)%-30s$(RESET) %s\n", $$1, $$2}'
|
||||||
|
.PHONY: help
|
||||||
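The `help` target above builds a self-documenting target list by scraping the `## ` annotations with grep and awk. A minimal Python sketch of the same extraction (hypothetical helper, for illustration only; the sample `MAKEFILE` string is made up):

```python
import re

# A tiny sample in the same style as the Makefile above (hypothetical input)
MAKEFILE = """\
run: ## start the wsgi (production) and development server
.PHONY: run
stop: ## stop the development server using Docker
"""

def targets_with_help(text):
    """Extract sorted (target, help) pairs from '## ' annotated rules,
    mirroring the grep -E '^[a-zA-Z0-9_-]+:.*?## ' | awk pipeline."""
    pairs = []
    for line in text.splitlines():
        match = re.match(r"^([a-zA-Z0-9_-]+):.*?## (.*)$", line)
        if match:
            pairs.append((match.group(1), match.group(2)))
    return sorted(pairs)

for name, doc in targets_with_help(MAKEFILE):
    print(f"{name:<30} {doc}")
```

Note that `.PHONY: run` is not matched: the regex only accepts target names made of `[a-zA-Z0-9_-]`, which excludes the leading dot, just like the grep pattern in the Makefile.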
78
README.md
Normal file
@@ -0,0 +1,78 @@
# Publish

publish is an application to handle users and teams.

publish is built on top of [Django Rest
Framework](https://www.django-rest-framework.org/).

## Getting started

### Prerequisites

Make sure you have a recent version of Docker and [Docker
Compose](https://docs.docker.com/compose/install) installed on your laptop:

```bash
$ docker -v
Docker version 20.10.2, build 2291f61

$ docker compose version
docker compose version 1.27.4, build 40524192
```

> ⚠️ You may need to run the following commands with `sudo`, but this can be
> avoided by adding your user to the `docker` group.

### Project bootstrap

The easiest way to start working on the project is to use GNU Make:

```bash
$ make bootstrap
```

This command builds the `app` container, installs dependencies, performs
database migrations and compiles translations. It's a good idea to use this
command each time you are pulling code from the project repository to avoid
dependency-related or migration-related issues.

Your Docker services should now be up and running 🎉

Note that if you need to run them afterwards, you can use the eponymous Make rule:

```bash
$ make run
```

### Adding content

You can create a basic demo site by running:

```bash
$ make demo
```

Finally, you can check all available Make rules using:

```bash
$ make help
```

### Django admin

You can access the Django admin site at
[http://localhost:8071/admin](http://localhost:8071/admin).

You first need to create a superuser account:

```bash
$ make superuser
```

## Contributing

This project is intended to be community-driven, so please do not hesitate to
get in touch if you have any questions related to our implementation or design
decisions.

## License

This work is released under the MIT License (see [LICENSE](./LICENSE)).
17
UPGRADE.md
Normal file
@@ -0,0 +1,17 @@
# Upgrade

All instructions to upgrade this project from one release to the next will be
documented in this file. Upgrades must be run sequentially, meaning you should
not skip minor/major releases while upgrading (fix releases can be skipped).

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

For most upgrades, you just need to run the django migrations with
the following command inside your docker container:

`python manage.py migrate`

(Note: in your development environment, you can run `make migrate`.)

## [Unreleased]
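The sequential-upgrade rule above (no skipped minor/major releases, fix releases skippable) can be sketched as a small check. This is a hypothetical helper, not part of the repository, assuming plain "X.Y.Z" version strings:

```python
def is_sequential(current, target):
    """Return True when upgrading current -> target skips no minor/major release.

    Fix (patch) releases may be skipped freely; crossing a major boundary is
    only allowed to the next major's first minor.
    """
    cur_major, cur_minor, _ = (int(p) for p in current.split("."))
    tgt_major, tgt_minor, _ = (int(p) for p in target.split("."))
    if tgt_major == cur_major:
        # same major: stay on the same minor or move to the next one
        return tgt_minor - cur_minor in (0, 1)
    # crossing a major boundary: only to the next major, starting at minor 0
    return tgt_major == cur_major + 1 and tgt_minor == 0

assert is_sequential("1.2.0", "1.3.4")      # next minor, any patch
assert is_sequential("1.2.0", "2.0.0")      # next major
assert not is_sequential("1.2.0", "1.4.0")  # skips 1.3
assert not is_sequential("1.2.0", "3.0.0")  # skips 2.x
```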
157
bin/_config.sh
Normal file
@@ -0,0 +1,157 @@
#!/usr/bin/env bash

set -eo pipefail

REPO_DIR="$(cd "$( dirname "${BASH_SOURCE[0]}" )/.." && pwd)"
UNSET_USER=0

TERRAFORM_DIRECTORY="./env.d/terraform"
COMPOSE_FILE="${REPO_DIR}/docker-compose.yml"
COMPOSE_PROJECT="publish"


# _set_user: set (or unset) the default user id used to run docker commands
#
# usage: _set_user
#
# You can override the default user ID (the current host user ID) by defining
# the USER_ID environment variable.
#
# To avoid running docker commands with a custom user, set the $UNSET_USER
# environment variable to 1.
function _set_user() {

    if [ "${UNSET_USER}" -eq 1 ]; then
        USER_ID=""
        return
    fi

    # USER_ID = USER_ID or `id -u` if USER_ID is not set
    USER_ID=${USER_ID:-$(id -u)}

    echo "🙋(user) ID: ${USER_ID}"
}

# _docker_compose: wrap the docker compose command
#
# usage: _docker_compose [options] [ARGS...]
#
# options: docker compose command options
# ARGS   : docker compose command arguments
function _docker_compose() {

    echo "🐳(compose) project: '${COMPOSE_PROJECT}' file: '${COMPOSE_FILE}'"
    docker compose \
        -p "${COMPOSE_PROJECT}" \
        -f "${COMPOSE_FILE}" \
        --project-directory "${REPO_DIR}" \
        "$@"
}

# _dc_run: wrap the docker compose run command
#
# usage: _dc_run [options] [ARGS...]
#
# options: docker compose run command options
# ARGS   : docker compose run command arguments
function _dc_run() {
    _set_user

    user_args="--user=$USER_ID"
    if [ -z "$USER_ID" ]; then
        user_args=""
    fi

    _docker_compose run --rm $user_args "$@"
}

# _dc_exec: wrap the docker compose exec command
#
# usage: _dc_exec [options] [ARGS...]
#
# options: docker compose exec command options
# ARGS   : docker compose exec command arguments
function _dc_exec() {
    _set_user

    echo "🐳(compose) exec command: '$*'"

    user_args="--user=$USER_ID"
    if [ -z "$USER_ID" ]; then
        user_args=""
    fi

    _docker_compose exec $user_args "$@"
}

# _django_manage: wrap django's manage.py command with docker compose
#
# usage: _django_manage [ARGS...]
#
# ARGS: django's manage.py command arguments
function _django_manage() {
    _dc_run "app-dev" python manage.py "$@"
}

# _set_openstack_project: select an OpenStack project from the openrc files
# defined in the terraform directory.
#
# usage: _set_openstack_project
#
# If necessary, the script will prompt the user to choose a project from those available.
function _set_openstack_project() {

    declare prompt
    declare -a projects
    declare -i default=1
    declare -i choice=0
    declare -i nb_projects

    # List projects by looking in the "./env.d/terraform" directory
    # and store them in an array
    read -r -a projects <<< "$(
        find "${TERRAFORM_DIRECTORY}" -maxdepth 1 -mindepth 1 -type d |
        sed 's|'"${TERRAFORM_DIRECTORY}\/"'||' |
        xargs
    )"
    nb_projects=${#projects[@]}

    if [[ ${nb_projects} -le 0 ]]; then
        echo "There are no OpenStack projects defined..." >&2
        echo "To add one, create a subdirectory in \"${TERRAFORM_DIRECTORY}\" with the name" \
            "of your project and copy your \"openrc.sh\" file into it." >&2
        exit 10
    fi

    if [[ ${nb_projects} -gt 1 ]]; then
        prompt="Select an OpenStack project to target:\\n"
        for (( i=0; i<nb_projects; i++ )); do
            prompt+="[$((i+1))] ${projects[$i]}"
            if [[ $((i+1)) -eq ${default} ]]; then
                prompt+=" (default)"
            fi
            prompt+="\\n"
        done
        prompt+="If your OpenStack project is not listed, add it to the \"env.d/terraform\" directory.\\n"
        prompt+="Your choice: "
        read -r -p "$(echo -e "${prompt}")" choice

        if [[ ${choice} -gt nb_projects ]]; then
            (>&2 echo "Invalid choice ${choice} (should be <= ${nb_projects})")
            exit 11
        fi

        if [[ ${choice} -le 0 ]]; then
            choice=${default}
        fi
    fi

    project=${projects[$((choice-1))]}
    # Check that the openrc.sh file exists for this project
    if [ ! -f "${TERRAFORM_DIRECTORY}/${project}/openrc.sh" ]; then
        (>&2 echo "Missing \"openrc.sh\" file in \"${TERRAFORM_DIRECTORY}/${project}\". Check documentation.")
        exit 12
    fi

    echo "${project}"
}
6
bin/compose
Executable file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_docker_compose "$@"
6
bin/manage
Executable file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_django_manage "$@"
38
bin/pylint
Executable file
@@ -0,0 +1,38 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

declare diff_from
declare -a paths
declare -a args

# Parse options
for arg in "$@"; do
    case $arg in
        --diff-only=*)
            diff_from="${arg#*=}"
            shift
            ;;
        -*)
            args+=("$arg")
            shift
            ;;
        *)
            paths+=("$arg")
            shift
            ;;
    esac
done

if [[ -n "${diff_from}" ]]; then
    # Run pylint only on modified files located in src/backend
    # (excluding deleted files and migration files)
    # shellcheck disable=SC2207
    paths=($(git diff "${diff_from}" --name-only --diff-filter=d -- src/backend ':!**/migrations/*.py' | grep -E '^src/backend/.*\.py$'))
fi

# Fix docker vs local path when project sources are mounted as a volume
read -ra paths <<< "$(echo "${paths[@]}" | sed "s|src/backend/||g")"
_dc_run app-dev pylint "${paths[@]}" "${args[@]}"
8
bin/pytest
Executable file
@@ -0,0 +1,8 @@
#!/usr/bin/env bash

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_dc_run \
    -e DJANGO_CONFIGURATION=Test \
    app-dev \
    pytest "$@"
25
bin/state
Executable file
@@ -0,0 +1,25 @@
#!/usr/bin/env bash
set -eo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

project=$(_set_openstack_project)
echo "Using \"${project}\" project..."

source "${TERRAFORM_DIRECTORY}/${project}/openrc.sh"

# Run Terraform commands in the Hashicorp docker container via docker compose
# shellcheck disable=SC2068
DOCKER_USER="$(id -u):$(id -g)" \
PROJECT="${project}" \
docker compose run --rm \
    -e OS_AUTH_URL \
    -e OS_IDENTITY_API_VERSION \
    -e OS_USER_DOMAIN_NAME \
    -e OS_PROJECT_DOMAIN_NAME \
    -e OS_TENANT_ID \
    -e OS_TENANT_NAME \
    -e OS_USERNAME \
    -e OS_PASSWORD \
    -e OS_REGION_NAME \
    terraform-state "$@"
26
bin/terraform
Executable file
@@ -0,0 +1,26 @@
#!/usr/bin/env bash
set -eo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

project=$(_set_openstack_project)
echo "Using \"${project}\" project..."

source "${TERRAFORM_DIRECTORY}/${project}/openrc.sh"

# Run Terraform commands in the Hashicorp docker container via docker compose
# shellcheck disable=SC2068
DOCKER_USER="$(id -u):$(id -g)" \
PROJECT="${project}" \
docker compose run --rm \
    -e OS_AUTH_URL \
    -e OS_IDENTITY_API_VERSION \
    -e OS_USER_DOMAIN_NAME \
    -e OS_PROJECT_DOMAIN_NAME \
    -e OS_TENANT_ID \
    -e OS_TENANT_NAME \
    -e OS_USERNAME \
    -e OS_PASSWORD \
    -e OS_REGION_NAME \
    -e TF_VAR_user_name \
    terraform "$@"
12
bin/update_openapi_schema
Executable file
@@ -0,0 +1,12 @@
#!/usr/bin/env bash

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_dc_run \
    -e DJANGO_CONFIGURATION=Test \
    app-dev \
    python manage.py spectacular \
    --api-version 'v1.0' \
    --urlconf 'publish.api_urls' \
    --format openapi-json \
    --file /app/core/tests/swagger/swagger.json
23
crowdin/config.yml
Normal file
@@ -0,0 +1,23 @@
#
# Your crowdin credentials
#
api_token_env: CROWDIN_API_TOKEN
project_id_env: CROWDIN_PROJECT_ID
base_path_env: CROWDIN_BASE_PATH

#
# Choose file structure in crowdin
# e.g. true or false
#
preserve_hierarchy: true

#
# Files configuration
#
files: [
  {
    source: "/backend/locale/django.pot",
    dest: "/backend.pot",
    translation: "/backend/locale/%locale_with_underscore%/LC_MESSAGES/django.po"
  },
]
135
docker-compose.yml
Normal file
@@ -0,0 +1,135 @@
version: '3.8'

services:
  postgresql:
    image: postgres:16
    env_file:
      - env.d/development/postgresql
    ports:
      - "15432:5432"

  redis:
    image: redis:5

  mailcatcher:
    image: sj26/mailcatcher:latest
    ports:
      - "1081:1080"

  app-dev:
    build:
      context: .
      target: development
      args:
        DOCKER_USER: ${DOCKER_USER:-1000}
    user: ${DOCKER_USER:-1000}
    image: publish:development
    environment:
      - PYLINTHOME=/app/.pylint.d
      - DJANGO_CONFIGURATION=Development
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    ports:
      - "8071:8000"
    volumes:
      - ./src/backend:/app
      - ./data/media:/data/media
      - ./data/static:/data/static
    depends_on:
      - postgresql
      - mailcatcher
      - redis

  celery-dev:
    user: ${DOCKER_USER:-1000}
    image: publish:development
    command: ["celery", "-A", "publish.celery_app", "worker", "-l", "DEBUG"]
    environment:
      - DJANGO_CONFIGURATION=Development
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    volumes:
      - ./src/backend:/app
      - ./data/media:/data/media
      - ./data/static:/data/static
    depends_on:
      - app-dev

  app:
    build:
      context: .
      target: production
      args:
        DOCKER_USER: ${DOCKER_USER:-1000}
    user: ${DOCKER_USER:-1000}
    image: publish:production
    environment:
      - DJANGO_CONFIGURATION=ContinuousIntegration
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    volumes:
      - ./data/media:/data/media
    depends_on:
      - postgresql
      - redis

  celery:
    user: ${DOCKER_USER:-1000}
    image: publish:production
    command: ["celery", "-A", "publish.celery_app", "worker", "-l", "INFO"]
    environment:
      - DJANGO_CONFIGURATION=ContinuousIntegration
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    depends_on:
      - app

  nginx:
    image: nginx:1.25
    ports:
      - "8082:8082"
    volumes:
      - ./docker/files/etc/nginx/conf.d:/etc/nginx/conf.d:ro
      - ./data/media:/data/media:ro
    depends_on:
      - app

  dockerize:
    image: jwilder/dockerize

  crowdin:
    image: crowdin/cli:3.16.0
    volumes:
      - ".:/app"
    env_file:
      - env.d/development/crowdin
    user: "${DOCKER_USER:-1000}"
    working_dir: /app

  node:
    image: node:18
    user: "${DOCKER_USER:-1000}"
    environment:
      HOME: /tmp
    volumes:
      - ".:/app"

  terraform-state:
    image: hashicorp/terraform:1.6
    environment:
      - TF_WORKSPACE=${PROJECT:-} # avoid env conflict in local state
    user: ${DOCKER_USER:-1000}
    working_dir: /app
    volumes:
      - ./src/terraform/create_state_bucket:/app

  terraform:
    image: hashicorp/terraform:1.6
    user: ${DOCKER_USER:-1000}
    working_dir: /app
    volumes:
      - ./src/terraform:/app
19
docker/files/etc/nginx/conf.d/default.conf
Normal file
@@ -0,0 +1,19 @@
server {

    listen 8082;
    server_name localhost;
    charset utf-8;

    location /media {
        alias /data/media;
    }

    location / {
        proxy_pass http://app:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

}
35
docker/files/usr/local/bin/entrypoint
Executable file
@@ -0,0 +1,35 @@
#!/bin/sh
#
# The container user (see USER in the Dockerfile) is an un-privileged user that
# does not exist and is not created during the build phase (see Dockerfile).
# Hence, we use this entrypoint to wrap commands that will be run in the
# container to create an entry for this user in the /etc/passwd file.
#
# The following environment variables may be passed to the container to
# customize the running user account:
#
# * USER_NAME: container user name (default: default)
# * HOME     : container user home directory (default: none)
#
# To pass environment variables, you can either use the -e option of the docker run command:
#
#     docker run --rm -e USER_NAME=foo -e HOME='/home/foo' publish:latest python manage.py migrate
#
# or define new variables in an environment file to use with docker or docker compose:
#
#     # env.d/production
#     USER_NAME=foo
#     HOME=/home/foo
#
#     docker run --rm --env-file env.d/production publish:latest python manage.py migrate
#

echo "🐳(entrypoint) creating user running in the container..."
if ! whoami > /dev/null 2>&1; then
    if [ -w /etc/passwd ]; then
        echo "${USER_NAME:-default}:x:$(id -u):$(id -g):${USER_NAME:-default} user:${HOME}:/sbin/nologin" >> /etc/passwd
    fi
fi

echo "🐳(entrypoint) running your command: ${*}"
exec "$@"
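The entrypoint above appends a seven-field `/etc/passwd` line for the otherwise anonymous container user. A small Python sketch of how that line is assembled (`passwd_entry` is a hypothetical helper mirroring the shell format string, not part of the repository):

```python
def passwd_entry(uid, gid, user_name="default", home=""):
    """Build the /etc/passwd line the entrypoint appends:
    name:password:UID:GID:GECOS:home:shell (shell is locked to /sbin/nologin)."""
    return f"{user_name}:x:{uid}:{gid}:{user_name} user:{home}:/sbin/nologin"

print(passwd_entry(1000, 1000, user_name="foo", home="/home/foo"))
# → foo:x:1000:1000:foo user:/home/foo:/sbin/nologin
```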
16
docker/files/usr/local/etc/gunicorn/publish.py
Normal file
@@ -0,0 +1,16 @@
# Gunicorn-django settings
bind = ["0.0.0.0:8000"]
name = "publish"
python_path = "/app"

# Run
graceful_timeout = 90
timeout = 90
workers = 3

# Logging
# Using '-' for the access log file makes gunicorn log accesses to stdout
accesslog = "-"
# Using '-' for the error log file makes gunicorn log errors to stderr
errorlog = "-"
loglevel = "info"
25
docs/tsclient.md
Normal file
@@ -0,0 +1,25 @@
# TypeScript API client

The backend application can automatically create a TypeScript client to be used in frontend
applications. It is used in the publish front application itself.

This client is made with [openapi-typescript-codegen](https://github.com/ferdikoomen/openapi-typescript-codegen)
and publish's backend OpenAPI schema (available [here](http://localhost:8071/v1.0/swagger/) if you have the backend running).

## Requirements

We'll need the online OpenAPI schema generated by swagger, so you will first need to
install the backend application.

## Install openApiClientJs

```sh
$ cd src/tsclient
$ yarn install
```

## Generate the client

```sh
yarn generate:api:client:local <output_path_for_generated_client>
```
20
env.d/development/common.dist
Normal file
@@ -0,0 +1,20 @@
# Django
DJANGO_ALLOWED_HOSTS=*
DJANGO_SECRET_KEY=ThisIsAnExampleKeyForDevPurposeOnly
DJANGO_SETTINGS_MODULE=publish.settings
DJANGO_SUPERUSER_PASSWORD=admin

# Python
PYTHONPATH=/app

# JWT
DJANGO_JWT_PRIVATE_SIGNING_KEY=ThisIsAnExampleKeyForDevPurposeOnly

# publish settings

# Mail
DJANGO_EMAIL_HOST="mailcatcher"
DJANGO_EMAIL_PORT=1025

# Backend url
publish_BASE_URL="http://localhost:8072"
3
env.d/development/crowdin.dist
Normal file
@@ -0,0 +1,3 @@
CROWDIN_API_TOKEN=Your-Api-Token
CROWDIN_PROJECT_ID=Your-Project-Id
CROWDIN_BASE_PATH=/app/src
11
env.d/development/postgresql.dist
Normal file
@@ -0,0 +1,11 @@
# Postgresql db container configuration
POSTGRES_DB=publish
POSTGRES_USER=dinum
POSTGRES_PASSWORD=pass

# App database configuration
DB_HOST=postgresql
DB_NAME=publish
DB_USER=dinum
DB_PASSWORD=pass
DB_PORT=5432
37
gitlint/gitlint_emoji.py
Normal file
@@ -0,0 +1,37 @@
"""
Gitlint extra rule to validate that the message title is of the form
"<gitmoji>(<scope>) <subject>"
"""
from __future__ import unicode_literals

import re

import requests

from gitlint.rules import CommitMessageTitle, LineRule, RuleViolation


class GitmojiTitle(LineRule):
    """
    This rule will enforce that each commit title is of the form "<gitmoji>(<scope>) <subject>"
    where gitmoji is an emoji from the list defined in https://gitmoji.carloscuesta.me and
    subject should be all lowercase
    """

    id = "UC1"
    name = "title-should-have-gitmoji-and-scope"
    target = CommitMessageTitle

    def validate(self, title, _commit):
        """
        Download the list of possible gitmojis from the project's github repository and
        check that the title contains one of them.
        """
        gitmojis = requests.get(
            "https://raw.githubusercontent.com/carloscuesta/gitmoji/master/packages/gitmojis/src/gitmojis.json"
        ).json()["gitmojis"]
        emojis = [item["emoji"] for item in gitmojis]
        pattern = r"^({:s})\(.*\)\s[a-z].*$".format("|".join(emojis))
        if not re.search(pattern, title):
            violation_msg = 'Title does not match regex "<gitmoji>(<scope>) <subject>"'
            return [RuleViolation(self.id, violation_msg, title)]
        return []
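The rule above builds an alternation of every known gitmoji and anchors it to the `<gitmoji>(<scope>) <subject>` shape. A minimal offline sketch of the same regex check (the emoji list is hardcoded here as an assumption, so the sketch needs no network access, unlike the real rule which downloads the full list):

```python
import re

# A few emojis from the gitmoji list, hardcoded for illustration
EMOJIS = ["✨", "🐛", "♻️"]

# Same shape as the rule's pattern: emoji, (scope), space, lowercase subject
PATTERN = r"^({:s})\(.*\)\s[a-z].*$".format("|".join(EMOJIS))

def title_ok(title):
    """Return True when a commit title matches '<gitmoji>(<scope>) <subject>'."""
    return re.search(PATTERN, title) is not None

assert title_ok("✨(project) first proof of concept printing pdf from markdown")
assert not title_ok("add feature without gitmoji")
```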
25
renovate.json
Normal file
@@ -0,0 +1,25 @@
{
  "extends": [
    "github>numerique-gouv/renovate-configuration"
  ],
  "packageRules": [
    {
      "enabled": false,
      "groupName": "ignored python dependencies",
      "matchManagers": [
        "setup-cfg"
      ],
      "matchPackageNames": []
    },
    {
      "enabled": false,
      "groupName": "ignored js dependencies",
      "matchManagers": [
        "npm"
      ],
      "matchPackageNames": [
        "node"
      ]
    }
  ]
}
472
src/backend/.pylintrc
Normal file
@@ -0,0 +1,472 @@
[MASTER]

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loaded into the active Python interpreter and may
# run arbitrary code
extension-pkg-whitelist=

# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=migrations

# Add files or directories matching the regex patterns to the blacklist. The
# regex matches against base names, not paths.
ignore-patterns=

# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=

# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use.
jobs=0

# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=pylint_django,pylint.extensions.no_self_use

# Pickle collected data for later comparisons.
persistent=yes

# Specify a configuration file.
#rcfile=

# When enabled, pylint would attempt to guess common misconfiguration and emit
# user-friendly hints instead of false-positive error messages
suggestion-mode=yes

# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no


[MESSAGES CONTROL]

# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=

# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W"
disable=bad-inline-option,
        deprecated-pragma,
        django-not-configured,
        file-ignored,
        locally-disabled,
        no-self-use,
        raw-checker-failed,
        suppressed-message,
        useless-suppression

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=c-extension-no-member


[REPORTS]

# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables error, warning, refactor, convention
# and statement, which respectively contain the number of messages in each
# category and the total number of statements analyzed. This is used by the
# global evaluation report (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details
#msg-template=

# Set the output format. Available formats are text, parseable, colorized, json
# and msvs (visual studio). You can also give a reporter class, eg
# mypackage.mymodule.MyReporterClass.
output-format=text

# Tells whether to display a full report or only the messages
reports=no

# Activate the evaluation score.
score=yes


[REFACTORING]

# Maximum number of nested blocks for function / method body
max-nested-blocks=5

# Complete name of functions that never return. When checking for
# inconsistent-return-statements, if a never returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=optparse.Values,sys.exit


[LOGGING]

# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging


[SPELLING]

# Limits count of emitted suggestions for spelling mistakes
max-spelling-suggestions=4

# Spelling dictionary name. Available dictionaries: none. To make it work,
# install the python-enchant package.
spelling-dict=

# List of comma separated words that should not be checked.
spelling-ignore-words=

# A path to a file that contains a private dictionary; one word per line.
spelling-private-dict-file=

# Tells whether to store unknown words to the indicated private dictionary in
# the --spelling-private-dict-file option instead of raising a message.
spelling-store-unknown-words=no


[MISCELLANEOUS]

# List of note tags to take in consideration, separated by a comma.
notes=FIXME,
      XXX,
      TODO


[TYPECHECK]

# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=

# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes

# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes

# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local,responses,
                Team,Contact

# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis). It
# supports qualified module names, as well as Unix pattern matching.
ignored-modules=

# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes

# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1

# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1


[VARIABLES]

# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=

# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes

# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
          _cb

# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_

# Argument names that match this expression will be ignored. Defaults to names
# with a leading underscore.
ignored-argument-names=_.*|^ignored_|^unused_

# Tells whether we should check for unused import in __init__ files.
init-import=no

# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins


[FORMAT]

# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=

# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$

# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4

# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
# tab).
indent-string='    '

# Maximum number of characters on a single line.
max-line-length=100

# Maximum number of lines in a module
max-module-lines=1000

# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
single-line-class-stmt=no

# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no


[SIMILARITIES]

# Ignore comments when computing similarities.
ignore-comments=yes

# Ignore docstrings when computing similarities.
ignore-docstrings=yes

# Ignore imports when computing similarities.
ignore-imports=yes

# Minimum lines number of a similarity.
# First implementations of CMS wizards have common fields we do not want to factorize for now
min-similarity-lines=35


[BASIC]

# Naming style matching correct argument names
argument-naming-style=snake_case

# Regular expression matching correct argument names. Overrides argument-
# naming-style
#argument-rgx=

# Naming style matching correct attribute names
attr-naming-style=snake_case

# Regular expression matching correct attribute names. Overrides attr-naming-
# style
#attr-rgx=

# Bad variable names which should always be refused, separated by a comma
bad-names=foo,
          bar,
          baz,
          toto,
          tutu,
          tata

# Naming style matching correct class attribute names
class-attribute-naming-style=any

# Regular expression matching correct class attribute names. Overrides class-
# attribute-naming-style
#class-attribute-rgx=

# Naming style matching correct class names
class-naming-style=PascalCase

# Regular expression matching correct class names. Overrides class-naming-style
#class-rgx=

# Naming style matching correct constant names
const-naming-style=UPPER_CASE

# Regular expression matching correct constant names. Overrides const-naming-
# style
const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__)|urlpatterns|logger)$

# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1

# Naming style matching correct function names
function-naming-style=snake_case

# Regular expression matching correct function names. Overrides function-
# naming-style
#function-rgx=

# Good variable names which should always be accepted, separated by a comma
good-names=i,
           j,
           k,
           cm,
           ex,
           Run,
           _

# Include a hint for the correct naming format with invalid-name
include-naming-hint=no

# Naming style matching correct inline iteration names
inlinevar-naming-style=any

# Regular expression matching correct inline iteration names. Overrides
# inlinevar-naming-style
#inlinevar-rgx=

# Naming style matching correct method names
method-naming-style=snake_case

# Regular expression matching correct method names. Overrides method-naming-
# style
method-rgx=([a-z_][a-z0-9_]{2,50}|setUp|set[Uu]pClass|tearDown|tear[Dd]ownClass|assert[A-Z]\w*|maxDiff|test_[a-z0-9_]+)$

# Naming style matching correct module names
module-naming-style=snake_case

# Regular expression matching correct module names. Overrides module-naming-
# style
#module-rgx=

# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=

# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_

# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
property-classes=abc.abstractproperty

# Naming style matching correct variable names
variable-naming-style=snake_case

# Regular expression matching correct variable names. Overrides variable-
# naming-style
#variable-rgx=


[IMPORTS]

# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no

# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no

# Deprecated modules which should not be used, separated by a comma
deprecated-modules=optparse,tkinter.tix

# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=

# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report RP0402 must not be disabled)
import-graph=

# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=

# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=

# Force import order to recognize a module as part of a third party library.
known-third-party=enchant


[CLASSES]

# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,
                      __new__,
                      setUp

# List of member names, which should be excluded from the protected access
# warning.
exclude-protected=_asdict,
                  _fields,
                  _replace,
                  _source,
                  _make

# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls

# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs


[DESIGN]

# Maximum number of arguments for function / method
max-args=5

# Maximum number of attributes for a class (see R0902).
max-attributes=7

# Maximum number of boolean expressions in an if statement
max-bool-expr=5

# Maximum number of branches for function / method body
max-branches=12

# Maximum number of locals for function / method body
max-locals=15

# Maximum number of parents for a class (see R0901).
max-parents=7

# Maximum number of public methods for a class (see R0904).
max-public-methods=20

# Maximum number of return / yield for function / method body
max-returns=6

# Maximum number of statements in function / method body
max-statements=50

# Minimum number of public methods for a class (see R0903).
min-public-methods=0


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=builtins.Exception
3
src/backend/MANIFEST.in
Normal file
@@ -0,0 +1,3 @@
include LICENSE
include README.md
recursive-include src/backend/publish *.html *.png *.gif *.css *.ico *.jpg *.jpeg *.po *.mo *.eot *.svg *.ttf *.woff *.woff2
0
src/backend/__init__.py
Normal file
0
src/backend/core/__init__.py
Normal file
37
src/backend/core/admin.py
Normal file
@@ -0,0 +1,37 @@
"""Admin classes and registrations for the publish core app."""
from django.contrib import admin

from . import models


class IdentityInline(admin.TabularInline):
    """Inline admin class for user identities."""

    model = models.Identity
    extra = 0


@admin.register(models.User)
class UserAdmin(admin.ModelAdmin):
    """User admin interface declaration."""

    inlines = (IdentityInline,)


class TemplateAccessInline(admin.TabularInline):
    """Inline admin class for template accesses."""

    model = models.TemplateAccess
    extra = 0


@admin.register(models.Template)
class TemplateAdmin(admin.ModelAdmin):
    """Template admin interface declaration."""

    inlines = (TemplateAccessInline,)


@admin.register(models.Team)
class TeamAdmin(admin.ModelAdmin):
    """Team admin interface declaration."""
39
src/backend/core/api/__init__.py
Normal file
@@ -0,0 +1,39 @@
"""publish core API endpoints"""
from django.conf import settings
from django.core.exceptions import ValidationError

from rest_framework import exceptions as drf_exceptions
from rest_framework import views as drf_views
from rest_framework.decorators import api_view
from rest_framework.response import Response


def exception_handler(exc, context):
    """Handle Django ValidationError as an accepted exception.

    For the parameters, see ``exception_handler``
    This code comes from twidi's gist:
    https://gist.github.com/twidi/9d55486c36b6a51bdcb05ce3a763e79f
    """
    if isinstance(exc, ValidationError):
        if hasattr(exc, "message_dict"):
            detail = exc.message_dict
        elif hasattr(exc, "message"):
            detail = exc.message
        elif hasattr(exc, "messages"):
            detail = exc.messages

        exc = drf_exceptions.ValidationError(detail=detail)

    return drf_views.exception_handler(exc, context)


# pylint: disable=unused-argument
@api_view(["GET"])
def get_frontend_configuration(request):
    """Return the frontend configuration dict as configured in settings."""
    frontend_configuration = {
        "LANGUAGE_CODE": settings.LANGUAGE_CODE,
    }
    frontend_configuration.update(settings.FRONTEND_CONFIGURATION)
    return Response(frontend_configuration)
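The branching in `exception_handler` maps the different shapes a Django `ValidationError` can take onto a DRF-friendly detail. A minimal stand-in (not the real Django class, just an object with the same attribute names) illustrates the precedence:

```python
class FakeValidationError:
    """Stand-in mimicking the attributes of django.core.exceptions.ValidationError."""

    def __init__(self, message_dict=None, message=None, messages=None):
        if message_dict is not None:
            self.message_dict = message_dict
        if message is not None:
            self.message = message
        if messages is not None:
            self.messages = messages


def extract_detail(exc):
    """Same attribute precedence as in exception_handler above."""
    if hasattr(exc, "message_dict"):
        return exc.message_dict
    if hasattr(exc, "message"):
        return exc.message
    if hasattr(exc, "messages"):
        return exc.messages
    return None


# Field-level errors (raised with a dict) keep their per-field mapping.
assert extract_detail(FakeValidationError(message_dict={"name": ["Required."]})) == {"name": ["Required."]}
# A single message is passed through as-is.
assert extract_detail(FakeValidationError(message="Invalid.")) == "Invalid."
```

Checking `message_dict` first means field-scoped errors keep their structure in the API response instead of being flattened into a plain list.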
54
src/backend/core/api/permissions.py
Normal file
@@ -0,0 +1,54 @@
"""Permission handlers for the publish core app."""
from django.core import exceptions

from rest_framework import permissions


class IsAuthenticated(permissions.BasePermission):
    """
    Allows access only to authenticated users. Alternative method checking the presence
    of the auth token to avoid hitting the database.
    """

    def has_permission(self, request, view):
        return bool(request.auth) if request.auth else request.user.is_authenticated


class IsSelf(IsAuthenticated):
    """
    Allows access only to authenticated users, and object-level access only to the
    user matching the object itself.
    """

    def has_object_permission(self, request, view, obj):
        """Write permissions are only allowed to the user itself."""
        return obj == request.user


class IsOwnedOrPublic(IsAuthenticated):
    """
    Allows access to authenticated users only for objects that are owned or not related
    to any user via the "owner" field.
    """

    def has_object_permission(self, request, view, obj):
        """Unsafe permissions are only allowed for the owner of the object."""
        if obj.owner == request.user:
            return True

        if request.method in permissions.SAFE_METHODS and obj.owner is None:
            return True

        try:
            return obj.user == request.user
        except exceptions.ObjectDoesNotExist:
            return False


class AccessPermission(IsAuthenticated):
    """Permission class for access objects."""

    def has_object_permission(self, request, view, obj):
        """Check permission for a given object."""
        abilities = obj.get_abilities(request.user)
        return abilities.get(request.method.lower(), False)
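`AccessPermission` delegates entirely to the model's `get_abilities`, keyed by the lowercased HTTP method, with absent keys defaulting to denial. A sketch with a hypothetical object (the ability dict here is an assumption, not the real model output):

```python
class FakeTeam:
    """Hypothetical object exposing get_abilities like the core models do."""

    def __init__(self, abilities):
        self._abilities = abilities

    def get_abilities(self, user):
        return self._abilities


def has_object_permission(method, obj, user=None):
    """Mirror of AccessPermission.has_object_permission above."""
    abilities = obj.get_abilities(user)
    return abilities.get(method.lower(), False)


team = FakeTeam({"get": True, "patch": True, "delete": False})
assert has_object_permission("GET", team)
assert not has_object_permission("DELETE", team)
assert not has_object_permission("POST", team)  # absent keys default to False
```

This keeps authorization rules on the model (one `get_abilities` per class) while the permission class stays a thin, generic adapter.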
137
src/backend/core/api/serializers.py
Normal file
@@ -0,0 +1,137 @@
"""Client serializers for the publish core app."""
from rest_framework import exceptions, serializers
from timezone_field.rest_framework import TimeZoneSerializerField

from core import models


class ContactSerializer(serializers.ModelSerializer):
    """Serialize contacts."""

    class Meta:
        model = models.Contact
        fields = [
            "id",
            "base",
            "data",
            "full_name",
            "owner",
            "short_name",
        ]
        read_only_fields = ["id", "owner"]

    def update(self, instance, validated_data):
        """Make the "base" field read-only, but only for update/patch."""
        validated_data.pop("base", None)
        return super().update(instance, validated_data)


class UserSerializer(serializers.ModelSerializer):
    """Serialize users."""

    timezone = TimeZoneSerializerField(use_pytz=False, required=True)

    class Meta:
        model = models.User
        fields = [
            "id",
            "language",
            "timezone",
            "is_device",
            "is_staff",
        ]
        read_only_fields = ["id", "is_device", "is_staff"]


class TeamAccessSerializer(serializers.ModelSerializer):
    """Serialize team accesses."""

    abilities = serializers.SerializerMethodField(read_only=True)

    class Meta:
        model = models.TeamAccess
        fields = ["id", "user", "role", "abilities"]
        read_only_fields = ["id", "abilities"]

    def update(self, instance, validated_data):
        """Make the "user" field read-only, but only on update."""
        validated_data.pop("user", None)
        return super().update(instance, validated_data)

    def get_abilities(self, access) -> dict:
        """Return abilities of the logged-in user on the instance."""
        request = self.context.get("request")
        if request:
            return access.get_abilities(request.user)
        return {}

    def validate(self, attrs):
        """
        Check access rights specific to writing (create/update).
        """
        request = self.context.get("request")
        user = getattr(request, "user", None)
        role = attrs.get("role")

        # Update
        if self.instance:
            can_set_role_to = self.instance.get_abilities(user)["set_role_to"]

            if role and role not in can_set_role_to:
                message = (
                    f"You are only allowed to set role to {', '.join(can_set_role_to)}"
                    if can_set_role_to
                    else "You are not allowed to set this role for this team."
                )
                raise exceptions.PermissionDenied(message)

        # Create
        else:
            try:
                team_id = self.context["team_id"]
            except KeyError as exc:
                raise exceptions.ValidationError(
                    "You must set a team ID in kwargs to create a new team access."
                ) from exc

            if not models.TeamAccess.objects.filter(
                team=team_id,
                user=user,
                role__in=[models.RoleChoices.OWNER, models.RoleChoices.ADMIN],
            ).exists():
                raise exceptions.PermissionDenied(
                    "You are not allowed to manage accesses for this team."
                )

            if (
                role == models.RoleChoices.OWNER
                and not models.TeamAccess.objects.filter(
                    team=team_id,
                    user=user,
                    role=models.RoleChoices.OWNER,
                ).exists()
            ):
                raise exceptions.PermissionDenied(
                    "Only owners of a team can assign other users as owners."
                )

        attrs["team_id"] = self.context["team_id"]
        return attrs


class TeamSerializer(serializers.ModelSerializer):
    """Serialize teams."""

    abilities = serializers.SerializerMethodField(read_only=True)
    accesses = TeamAccessSerializer(many=True, read_only=True)

    class Meta:
        model = models.Team
        fields = ["id", "name", "accesses", "abilities"]
        read_only_fields = ["id", "accesses", "abilities"]

    def get_abilities(self, team) -> dict:
        """Return abilities of the logged-in user on the instance."""
        request = self.context.get("request")
        if request:
            return team.get_abilities(request.user)
        return {}
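The create branch of `TeamAccessSerializer.validate` boils down to two rules: only admins and owners may manage accesses at all, and only owners may grant the owner role. A pure-Python distillation, with the database lookups replaced by the requester's role passed in directly (the role strings are assumptions standing in for `RoleChoices` values):

```python
def can_create_access(requester_role, new_role):
    """Distilled from the create branch of TeamAccessSerializer.validate above."""
    # Only owners and administrators may manage accesses for the team.
    if requester_role not in ("owner", "administrator"):
        return False
    # Only owners may assign other users as owners.
    if new_role == "owner" and requester_role != "owner":
        return False
    return True


assert can_create_access("administrator", "member")
assert not can_create_access("member", "member")
assert not can_create_access("administrator", "owner")
assert can_create_access("owner", "owner")
```

In the serializer itself, each rule maps to one `TeamAccess.objects.filter(...).exists()` query scoped to the team and requesting user.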
14
src/backend/core/api/utils.py
Normal file
@@ -0,0 +1,14 @@
"""
Utils that can be useful throughout the publish core app
"""
from rest_framework_simplejwt.tokens import RefreshToken


def get_tokens_for_user(user):
    """Get JWT tokens for user authentication."""
    refresh = RefreshToken.for_user(user)

    return {
        "refresh": str(refresh),
        "access": str(refresh.access_token),
    }
349
src/backend/core/api/viewsets.py
Normal file
@@ -0,0 +1,349 @@
"""API endpoints"""
from django.contrib.postgres.search import TrigramSimilarity
from django.core.cache import cache
from django.db.models import (
    Func,
    OuterRef,
    Q,
    Subquery,
    Value,
)

from rest_framework import (
    decorators,
    exceptions,
    mixins,
    pagination,
    response,
    viewsets,
)

from core import models

from . import permissions, serializers


class NestedGenericViewSet(viewsets.GenericViewSet):
    """
    A generic viewset intended to be used in a nested route context,
    e.g: `/api/v1.0/resource_1/<resource_1_pk>/resource_2/<resource_2_pk>/`

    It allows defining all URL kwargs and lookup fields to perform the lookup.
    """

    lookup_fields: list[str] = ["pk"]
    lookup_url_kwargs: list[str] = []

    def __getattribute__(self, item):
        """
        This method is overridden to return the last lookup field or lookup URL kwarg
        when accessing the `lookup_field` or `lookup_url_kwarg` attribute. This is useful
        to keep compatibility with all methods used by the parent class `GenericViewSet`.
        """
        if item in ["lookup_field", "lookup_url_kwarg"]:
            return getattr(self, item + "s", [None])[-1]

        return super().__getattribute__(item)

    def get_queryset(self):
        """
        Get the list of items for this view.

        The `lookup_fields` attribute is enumerated here to perform the nested lookup.
        """
        queryset = super().get_queryset()

        # The last lookup field is removed to perform the nested lookup as it corresponds
        # to the object pk; it is used within the get_object method.
        lookup_url_kwargs = (
            self.lookup_url_kwargs[:-1]
            if self.lookup_url_kwargs
            else self.lookup_fields[:-1]
        )

        filter_kwargs = {}
        for index, lookup_url_kwarg in enumerate(lookup_url_kwargs):
            if lookup_url_kwarg not in self.kwargs:
                raise KeyError(
                    f"Expected view {self.__class__.__name__} to be called with a URL "
                    f'keyword argument named "{lookup_url_kwarg}". Fix your URL conf, or '
                    "set the `.lookup_fields` attribute on the view correctly."
                )

            filter_kwargs.update(
                {self.lookup_fields[index]: self.kwargs[lookup_url_kwarg]}
            )

        return queryset.filter(**filter_kwargs)


class SerializerPerActionMixin:
    """
    A mixin allowing a serializer class to be defined for each action.

    This mixin is useful to avoid defining a serializer class for each action in the
|
||||||
|
`get_serializer_class` method.
|
||||||
|
"""
|
||||||
|
|
||||||
|
serializer_classes: dict[str, type] = {}
|
||||||
|
default_serializer_class: type = None
|
||||||
|
|
||||||
|
def get_serializer_class(self):
|
||||||
|
"""
|
||||||
|
Return the serializer class to use depending on the action.
|
||||||
|
"""
|
||||||
|
return self.serializer_classes.get(self.action, self.default_serializer_class)
|
||||||
|
|
||||||
|
|
||||||
class Pagination(pagination.PageNumberPagination):
    """Pagination to display no more than 100 objects per page sorted by creation date."""

    ordering = "-created_on"
    max_page_size = 100
    page_size_query_param = "page_size"


# pylint: disable=too-many-ancestors
class ContactViewSet(
    mixins.CreateModelMixin,
    mixins.DestroyModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    """Contact ViewSet"""

    permission_classes = [permissions.IsOwnedOrPublic]
    queryset = models.Contact.objects.all()
    serializer_class = serializers.ContactSerializer

    def list(self, request, *args, **kwargs):
        """Limit listed users by a query with throttle protection."""
        user = self.request.user
        queryset = self.filter_queryset(self.get_queryset())

        if not user.is_authenticated:
            return response.Response([])

        # Exclude contacts that:
        queryset = queryset.filter(
            # - belong to another user (keep public and owned contacts)
            Q(owner__isnull=True) | Q(owner=user),
            # - are profile contacts for a user
            user__isnull=True,
            # - are overridden base contacts
            overriding_contacts__isnull=True,
        )

        # Search by case-insensitive and accent-insensitive trigram similarity
        if query := self.request.GET.get("q", ""):
            query = Func(Value(query), function="unaccent")
            similarity = TrigramSimilarity(
                Func("full_name", function="unaccent"),
                query,
            ) + TrigramSimilarity(Func("short_name", function="unaccent"), query)
            queryset = (
                queryset.annotate(similarity=similarity)
                .filter(
                    similarity__gte=0.05
                )  # Value determined by testing (test_api_contacts.py)
                .order_by("-similarity")
            )

        # Throttle protection
        key_base = f"throttle-contact-list-{user.id!s}"
        key_minute = f"{key_base:s}-minute"
        key_hour = f"{key_base:s}-hour"

        try:
            count_minute = cache.incr(key_minute)
        except ValueError:
            cache.set(key_minute, 1, 60)
            count_minute = 1

        try:
            count_hour = cache.incr(key_hour)
        except ValueError:
            cache.set(key_hour, 1, 3600)
            count_hour = 1

        if count_minute > 20 or count_hour > 150:
            raise exceptions.Throttled()

        serializer = self.get_serializer(queryset, many=True)
        return response.Response(serializer.data)

    def perform_create(self, serializer):
        """Set the current user as owner of the newly created contact."""
        user = self.request.user
        serializer.validated_data["owner"] = user
        return super().perform_create(serializer)

class UserViewSet(
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    """User ViewSet"""

    permission_classes = [permissions.IsSelf]
    queryset = models.User.objects.all()
    serializer_class = serializers.UserSerializer

    @decorators.action(
        detail=False,
        methods=["get"],
        url_name="me",
        url_path="me",
        permission_classes=[permissions.IsAuthenticated],
    )
    def get_me(self, request):
        """Return information on the currently logged-in user."""
        context = {"request": request}
        return response.Response(
            self.serializer_class(request.user, context=context).data
        )


class TeamViewSet(
    mixins.CreateModelMixin,
    mixins.DestroyModelMixin,
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    """Team ViewSet"""

    permission_classes = [permissions.AccessPermission]
    serializer_class = serializers.TeamSerializer
    queryset = models.Team.objects.all()

    def get_queryset(self):
        """Custom queryset to get user related teams."""
        user_role_query = models.TeamAccess.objects.filter(
            user=self.request.user, team=OuterRef("pk")
        ).values("role")[:1]
        return models.Team.objects.filter(accesses__user=self.request.user).annotate(
            user_role=Subquery(user_role_query)
        )

    def perform_create(self, serializer):
        """Set the current user as owner of the newly created team."""
        team = serializer.save()
        models.TeamAccess.objects.create(
            team=team,
            user=self.request.user,
            role=models.RoleChoices.OWNER,
        )

class TeamAccessViewSet(
    mixins.CreateModelMixin,
    mixins.DestroyModelMixin,
    mixins.ListModelMixin,
    mixins.RetrieveModelMixin,
    mixins.UpdateModelMixin,
    viewsets.GenericViewSet,
):
    """
    API ViewSet for all interactions with team accesses.

    GET /api/v1.0/teams/<team_id>/accesses/:<team_access_id>
        Return list of all team accesses related to the logged-in user or one
        team access if an id is provided.

    POST /api/v1.0/teams/<team_id>/accesses/ with expected data:
        - user: str
        - role: str [owner|admin|member]
        Return newly created team access

    PUT /api/v1.0/teams/<team_id>/accesses/<team_access_id>/ with expected data:
        - role: str [owner|admin|member]
        Return updated team access

    PATCH /api/v1.0/teams/<team_id>/accesses/<team_access_id>/ with expected data:
        - role: str [owner|admin|member]
        Return partially updated team access

    DELETE /api/v1.0/teams/<team_id>/accesses/<team_access_id>/
        Delete targeted team access
    """

    lookup_field = "pk"
    pagination_class = Pagination
    permission_classes = [permissions.AccessPermission]
    queryset = models.TeamAccess.objects.all().select_related("user")
    serializer_class = serializers.TeamAccessSerializer

    def get_permissions(self):
        """User only needs to be authenticated to list team accesses."""
        if self.action == "list":
            permission_classes = [permissions.IsAuthenticated]
        else:
            return super().get_permissions()

        return [permission() for permission in permission_classes]

    def get_serializer_context(self):
        """Extra context provided to the serializer class."""
        context = super().get_serializer_context()
        context["team_id"] = self.kwargs["team_id"]
        return context

    def get_queryset(self):
        """Return the queryset according to the action."""
        queryset = super().get_queryset()
        queryset = queryset.filter(team=self.kwargs["team_id"])

        if self.action == "list":
            # Limit to team access instances related to a team THAT also has a team access
            # instance for the logged-in user (we don't want to list only the team access
            # instances pointing to the logged-in user)
            user_role_query = models.TeamAccess.objects.filter(
                team__accesses__user=self.request.user
            ).values("role")[:1]
            queryset = (
                queryset.filter(
                    team__accesses__user=self.request.user,
                )
                .annotate(user_role=Subquery(user_role_query))
                .distinct()
            )
        return queryset

    def destroy(self, request, *args, **kwargs):
        """Forbid deleting the last owner access."""
        instance = self.get_object()
        team = instance.team

        # Check if the access being deleted is the last owner access for the team
        if instance.role == "owner" and team.accesses.filter(role="owner").count() == 1:
            return response.Response(
                {"detail": "Cannot delete the last owner access for the team."},
                status=400,
            )

        return super().destroy(request, *args, **kwargs)

    def perform_update(self, serializer):
        """Check that we don't change the role if it leads to losing the last owner."""
        instance = serializer.instance

        # Check if the role is being updated and the new role is not "owner"
        if (
            "role" in self.request.data
            and self.request.data["role"] != models.RoleChoices.OWNER
        ):
            team = instance.team
            # Check if the access being updated is the last owner access for the team
            if (
                instance.role == models.RoleChoices.OWNER
                and team.accesses.filter(role=models.RoleChoices.OWNER).count() == 1
            ):
                message = "Cannot change the role to a non-owner role for the last owner access."
                raise exceptions.ValidationError({"role": message})

        serializer.save()
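The throttle in `ContactViewSet.list` is a fixed-window counter pair (20 requests per minute, 150 per hour, per user) built on `cache.incr`. A minimal self-contained sketch of the same pattern, with an in-memory dict standing in for Django's cache (the class name and signatures here are illustrative assumptions):

```python
import time


class FixedWindowThrottle:
    """Fixed-window counters mirroring the cache.incr pattern used above."""

    def __init__(self, per_minute=20, per_hour=150):
        self.per_minute = per_minute
        self.per_hour = per_hour
        self._cache = {}  # key -> (count, window expiry timestamp)

    def _incr(self, key, ttl):
        """Increment a counter, starting a fresh window when the old one expired."""
        now = time.monotonic()
        count, expires_at = self._cache.get(key, (0, now + ttl))
        if now >= expires_at:  # window expired, reset the counter
            count, expires_at = 0, now + ttl
        count += 1
        self._cache[key] = (count, expires_at)
        return count

    def allow(self, user_id) -> bool:
        """Return False once either the minute or the hour budget is exceeded."""
        minute = self._incr(f"throttle-contact-list-{user_id}-minute", 60)
        hour = self._incr(f"throttle-contact-list-{user_id}-hour", 3600)
        return minute <= self.per_minute and hour <= self.per_hour
```

As in the viewset, both counters are incremented on every call, so a burst that exhausts the minute budget still consumes hour budget.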
11
src/backend/core/apps.py
Normal file
@@ -0,0 +1,11 @@
"""publish Core application"""
# from django.apps import AppConfig
# from django.utils.translation import gettext_lazy as _


# class CoreConfig(AppConfig):
#     """Configuration class for the publish core app."""

#     name = "core"
#     app_label = "core"
#     verbose_name = _("publish core application")
59
src/backend/core/authentication.py
Normal file
@@ -0,0 +1,59 @@
"""Authentication for the publish core app."""
from django.conf import settings
from django.utils.functional import SimpleLazyObject
from django.utils.module_loading import import_string
from django.utils.translation import gettext_lazy as _

from drf_spectacular.authentication import SessionScheme, TokenScheme
from drf_spectacular.plumbing import build_bearer_security_scheme_object
from rest_framework import authentication
from rest_framework_simplejwt.authentication import JWTAuthentication


class DelegatedJWTAuthentication(JWTAuthentication):
    """Override JWTAuthentication to create missing users on the fly."""

    def get_user(self, validated_token):
        """
        Return the user related to the given validated token, creating or updating it if necessary.
        """
        get_user = import_string(settings.JWT_USER_GETTER)
        return SimpleLazyObject(lambda: get_user(validated_token))


class OpenApiJWTAuthenticationExtension(TokenScheme):
    """Extension for specifying JWT authentication schemes."""

    target_class = "core.authentication.DelegatedJWTAuthentication"
    name = "DelegatedJWTAuthentication"

    def get_security_definition(self, auto_schema):
        """Return the security definition for JWT authentication."""
        return build_bearer_security_scheme_object(
            header_name="Authorization",
            token_prefix="Bearer",  # noqa S106
        )


class SessionAuthenticationWithAuthenticateHeader(authentication.SessionAuthentication):
    """
    This class is needed because REST Framework's default SessionAuthentication
    never returns 401s, since it cannot fill the WWW-Authenticate header with a
    valid value in the 401 response. As a result, we cannot distinguish calls that
    are not authenticated (401 unauthorized) from calls for which the user does
    not have permission (403 forbidden).
    See https://github.com/encode/django-rest-framework/issues/5968

    We do set the authenticate_header function in SessionAuthentication, so that a
    value for the WWW-Authenticate header can be retrieved and the response code is
    automatically set to 401 in case of unauthenticated requests.
    """

    def authenticate_header(self, request):
        """Return a value for the WWW-Authenticate header."""
        return "Session"


class OpenApiSessionAuthenticationExtension(SessionScheme):
    """Extension for specifying session authentication schemes."""

    target_class = "core.authentication.SessionAuthenticationWithAuthenticateHeader"
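An illustrative sketch (an assumption, not part of the commit) of why overriding `authenticate_header` matters: DRF only answers an unauthenticated request with 401 when some authenticator supplies a WWW-Authenticate value; when none does, it falls back to 403, hiding the 401/403 distinction the docstring above describes.

```python
def status_for_unauthenticated(www_authenticate_value):
    """Mimic DRF's fallback: 401 requires a WWW-Authenticate value, else 403."""
    return 401 if www_authenticate_value else 403


# Default SessionAuthentication supplies no header value, so DRF answers 403
default_status = status_for_unauthenticated(None)
# The override returns "Session", enabling a proper 401 for unauthenticated calls
patched_status = status_for_unauthenticated("Session")
```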
15
src/backend/core/enums.py
Normal file
@@ -0,0 +1,15 @@
"""
Core application enums declaration
"""
from django.conf import global_settings, settings
from django.utils.translation import gettext_lazy as _

# Django sets `LANGUAGES` by default with all supported languages. We can use it for
# the choice of languages, which should not be limited to the few languages active in
# the app.
# pylint: disable=no-member
ALL_LANGUAGES = getattr(
    settings,
    "ALL_LANGUAGES",
    [(language, _(name)) for language, name in global_settings.LANGUAGES],
)
66
src/backend/core/factories.py
Normal file
@@ -0,0 +1,66 @@
# ruff: noqa: S311
"""
Core application factories
"""
from django.conf import settings
from django.contrib.auth.hashers import make_password

import factory.fuzzy
from faker import Faker

from core import models

fake = Faker()


class UserFactory(factory.django.DjangoModelFactory):
    """A factory to create random users for testing purposes."""

    class Meta:
        model = models.User

    language = factory.fuzzy.FuzzyChoice([lang[0] for lang in settings.LANGUAGES])
    password = make_password("password")


class IdentityFactory(factory.django.DjangoModelFactory):
    """A factory to create identities for a user"""

    class Meta:
        model = models.Identity
        django_get_or_create = ("sub",)

    user = factory.SubFactory(UserFactory)
    sub = factory.Sequence(lambda n: f"user{n!s}")
    email = factory.Faker("email")


class TeamFactory(factory.django.DjangoModelFactory):
    """A factory to create teams"""

    class Meta:
        model = models.Team
        django_get_or_create = ("name",)

    name = factory.Sequence(lambda n: f"team{n}")

    @factory.post_generation
    def users(self, create, extracted, **kwargs):
        """Add users to team from a given list of users with or without roles."""
        if create and extracted:
            for item in extracted:
                if isinstance(item, models.User):
                    TeamAccessFactory(team=self, user=item)
                else:
                    TeamAccessFactory(team=self, user=item[0], role=item[1])


class TeamAccessFactory(factory.django.DjangoModelFactory):
    """Create fake team user accesses for testing."""

    class Meta:
        model = models.TeamAccess

    team = factory.SubFactory(TeamFactory)
    user = factory.SubFactory(UserFactory)
    role = factory.fuzzy.FuzzyChoice([r[0] for r in models.RoleChoices.choices])
8
src/backend/core/forms.py
Normal file
@@ -0,0 +1,8 @@
"""Forms for the core app of Publish"""
from django import forms

from .models import Template


class DocumentGenerationForm(forms.Form):
    body = forms.CharField(widget=forms.Textarea, label="Markdown Body")
    template = forms.ModelChoiceField(queryset=Template.objects.all(), label="Choose Template")
129
src/backend/core/migrations/0001_initial.py
Normal file
@@ -0,0 +1,129 @@
# Generated by Django 5.0 on 2024-01-06 18:05

import django.contrib.auth.models
import django.core.validators
import django.db.models.deletion
import timezone_field.fields
import uuid
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('auth', '0012_alter_user_first_name_max_length'),
    ]

    operations = [
        migrations.CreateModel(
            name='Team',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('name', models.CharField(max_length=100)),
            ],
            options={
                'verbose_name': 'Team',
                'verbose_name_plural': 'Teams',
                'db_table': 'publish_role',
                'ordering': ('name',),
            },
        ),
        migrations.CreateModel(
            name='Template',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('title', models.CharField(max_length=255, verbose_name='title')),
                ('description', models.TextField(blank=True, verbose_name='description')),
                ('code', models.TextField(blank=True, verbose_name='code')),
                ('css', models.TextField(blank=True, verbose_name='css')),
                ('is_public', models.BooleanField(default=False, help_text='Whether this template is public for anyone to use.', verbose_name='public')),
            ],
            options={
                'verbose_name': 'Template',
                'verbose_name_plural': 'Templates',
                'db_table': 'publish_template',
                'ordering': ('title',),
            },
        ),
        migrations.CreateModel(
            name='User',
            fields=[
                ('password', models.CharField(max_length=128, verbose_name='password')),
                ('last_login', models.DateTimeField(blank=True, null=True, verbose_name='last login')),
                ('is_superuser', models.BooleanField(default=False, help_text='Designates that this user has all permissions without explicitly assigning them.', verbose_name='superuser status')),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('email', models.EmailField(max_length=254, unique=True, verbose_name='email address')),
                ('language', models.CharField(choices=[('en-us', 'English'), ('fr-fr', 'French')], default='en-us', help_text='The language in which the user wants to see the interface.', max_length=10, verbose_name='language')),
                ('timezone', timezone_field.fields.TimeZoneField(choices_display='WITH_GMT_OFFSET', default='UTC', help_text='The timezone in which the user wants to see times.', use_pytz=False)),
                ('is_device', models.BooleanField(default=False, help_text='Whether the user is a device or a real user.', verbose_name='device')),
                ('is_staff', models.BooleanField(default=False, help_text='Whether the user can log into this admin site.', verbose_name='staff status')),
                ('is_active', models.BooleanField(default=True, help_text='Whether this user should be treated as active. Unselect this instead of deleting accounts.', verbose_name='active')),
                ('groups', models.ManyToManyField(blank=True, help_text='The groups this user belongs to. A user will get all permissions granted to each of their groups.', related_name='user_set', related_query_name='user', to='auth.group', verbose_name='groups')),
                ('user_permissions', models.ManyToManyField(blank=True, help_text='Specific permissions for this user.', related_name='user_set', related_query_name='user', to='auth.permission', verbose_name='user permissions')),
            ],
            options={
                'verbose_name': 'user',
                'verbose_name_plural': 'users',
                'db_table': 'publish_user',
            },
            managers=[
                ('objects', django.contrib.auth.models.UserManager()),
            ],
        ),
        migrations.CreateModel(
            name='Identity',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('sub', models.CharField(blank=True, help_text='Required. 255 characters or fewer. Letters, numbers, and @/./+/-/_ characters only.', max_length=255, null=True, unique=True, validators=[django.core.validators.RegexValidator(message='Enter a valid sub. This value may contain only letters, numbers, and @/./+/-/_ characters.', regex='^[\\w.@+-]+\\Z')], verbose_name='sub')),
                ('email', models.EmailField(max_length=254, verbose_name='email address')),
                ('is_main', models.BooleanField(default=False, help_text='Designates whether the email is the main one.', verbose_name='main')),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='identities', to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'verbose_name': 'identity',
                'verbose_name_plural': 'identities',
                'db_table': 'publish_identity',
                'ordering': ('-is_main', 'email'),
            },
        ),
        migrations.CreateModel(
            name='TemplateAccess',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('role', models.CharField(choices=[('member', 'Member'), ('administrator', 'Administrator'), ('owner', 'Owner')], default='member', max_length=20)),
                ('team', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.team')),
                ('template', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.template')),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'verbose_name': 'Template/user relation',
                'verbose_name_plural': 'Template/user relations',
                'db_table': 'publish_template_access',
            },
        ),
        migrations.AddConstraint(
            model_name='identity',
            constraint=models.UniqueConstraint(fields=('user', 'email'), name='unique_user_email', violation_error_message='This email address is already declared for this user.'),
        ),
        migrations.AddConstraint(
            model_name='templateaccess',
            constraint=models.UniqueConstraint(fields=('user', 'template'), name='unique_template_user', violation_error_message='This user is already in this template.'),
        ),
        migrations.AddConstraint(
            model_name='templateaccess',
            constraint=models.UniqueConstraint(fields=('team', 'template'), name='unique_template_team', violation_error_message='This team is already in this template.'),
        ),
    ]
0
src/backend/core/migrations/__init__.py
Normal file
393
src/backend/core/models.py
Normal file
393
src/backend/core/models.py
Normal file
@@ -0,0 +1,393 @@
|
|||||||
|
"""
|
||||||
|
Declare and configure the models for the publish core application
|
||||||
|
"""
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
import textwrap
|
||||||
|
import uuid
|
||||||
|
|
||||||
|
from django.conf import settings
|
||||||
|
from django.contrib.auth import models as auth_models
|
||||||
|
from django.contrib.auth.base_user import AbstractBaseUser
|
||||||
|
from django.core import exceptions, mail, validators
|
||||||
|
from django.db import models
|
||||||
|
from django.utils.functional import lazy
|
||||||
|
from django.utils.translation import gettext_lazy as _
|
||||||
|
|
||||||
|
from django.template.base import Template as DjangoTemplate
|
||||||
|
from django.template.context import Context
|
||||||
|
from django.template.engine import Engine
|
||||||
|
|
||||||
|
import markdown
|
||||||
|
from weasyprint import CSS, HTML
|
||||||
|
from weasyprint.text.fonts import FontConfiguration
|
||||||
|
|
||||||
|
import jsonschema
|
||||||
|
from rest_framework_simplejwt.exceptions import InvalidToken
|
||||||
|
from rest_framework_simplejwt.settings import api_settings
|
||||||
|
from timezone_field import TimeZoneField
|
||||||
|
|
||||||
|
|
||||||
|
class RoleChoices(models.TextChoices):
|
||||||
|
"""Defines the possible roles a user can have in a template."""
|
||||||
|
|
||||||
|
MEMBER = "member", _("Member")
|
||||||
|
ADMIN = "administrator", _("Administrator")
|
||||||
|
OWNER = "owner", _("Owner")
|
||||||
|
|
||||||
|
|
||||||
|
class BaseModel(models.Model):
|
||||||
|
"""
|
||||||
|
Serves as an abstract base model for other models, ensuring that records are validated
|
||||||
|
before saving as Django doesn't do it by default.
|
||||||
|
|
||||||
|
Includes fields common to all models: a UUID primary key and creation/update timestamps.
|
||||||
|
"""
|
||||||
|
|
||||||
|
id = models.UUIDField(
|
||||||
|
verbose_name=_("id"),
|
||||||
|
help_text=_("primary key for the record as UUID"),
|
||||||
|
primary_key=True,
|
||||||
|
default=uuid.uuid4,
|
||||||
|
editable=False,
|
||||||
|
)
|
||||||
|
created_at = models.DateTimeField(
|
||||||
|
verbose_name=_("created on"),
|
||||||
|
help_text=_("date and time at which a record was created"),
|
||||||
|
auto_now_add=True,
|
||||||
|
editable=False,
|
||||||
|
)
|
||||||
|
updated_at = models.DateTimeField(
|
||||||
|
verbose_name=_("updated on"),
|
||||||
|
help_text=_("date and time at which a record was last updated"),
|
||||||
|
auto_now=True,
|
||||||
|
editable=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
class Meta:
|
||||||
|
abstract = True
|
||||||
|
|
||||||
|
def save(self, *args, **kwargs):
|
||||||
|
"""Call `full_clean` before saving."""
|
||||||
|
self.full_clean()
|
||||||
|
super().save(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
    """User model to work with OIDC only authentication."""

    email = models.EmailField(_("email address"), unique=True)
    language = models.CharField(
        max_length=10,
        choices=lazy(lambda: settings.LANGUAGES, tuple)(),
        default=settings.LANGUAGE_CODE,
        verbose_name=_("language"),
        help_text=_("The language in which the user wants to see the interface."),
    )
    timezone = TimeZoneField(
        choices_display="WITH_GMT_OFFSET",
        use_pytz=False,
        default=settings.TIME_ZONE,
        help_text=_("The timezone in which the user wants to see times."),
    )
    is_device = models.BooleanField(
        _("device"),
        default=False,
        help_text=_("Whether the user is a device or a real user."),
    )
    is_staff = models.BooleanField(
        _("staff status"),
        default=False,
        help_text=_("Whether the user can log into this admin site."),
    )
    is_active = models.BooleanField(
        _("active"),
        default=True,
        help_text=_(
            "Whether this user should be treated as active. "
            "Unselect this instead of deleting accounts."
        ),
    )

    objects = auth_models.UserManager()

    USERNAME_FIELD = "email"
    REQUIRED_FIELDS = []

    class Meta:
        db_table = "publish_user"
        verbose_name = _("user")
        verbose_name_plural = _("users")


class Identity(BaseModel):
    """User identity"""

    sub_validator = validators.RegexValidator(
        regex=r"^[\w.@+-]+\Z",
        message=_(
            "Enter a valid sub. This value may contain only letters, "
            "numbers, and @/./+/-/_ characters."
        ),
    )

    user = models.ForeignKey(User, related_name="identities", on_delete=models.CASCADE)
    sub = models.CharField(
        _("sub"),
        help_text=_(
            "Required. 255 characters or fewer. Letters, numbers, and @/./+/-/_ characters only."
        ),
        max_length=255,
        unique=True,
        validators=[sub_validator],
        blank=True,
        null=True,
    )
    email = models.EmailField(_("email address"))
    is_main = models.BooleanField(
        _("main"),
        default=False,
        help_text=_("Designates whether the email is the main one."),
    )

    class Meta:
        db_table = "publish_identity"
        ordering = ("-is_main", "email")
        verbose_name = _("identity")
        verbose_name_plural = _("identities")
        constraints = [
            # Uniqueness
            models.UniqueConstraint(
                fields=["user", "email"],
                name="unique_user_email",
                violation_error_message=_(
                    "This email address is already declared for this user."
                ),
            ),
        ]

    def __str__(self):
        main_str = "[main]" if self.is_main else ""
        return f"{self.email:s}{main_str:s}"

    def clean(self):
        """Normalize the email field and clean the 'is_main' field."""
        self.email = User.objects.normalize_email(self.email)
        if not self.user.identities.exclude(pk=self.pk).filter(is_main=True).exists():
            if not self.created_at:
                self.is_main = True
            elif not self.is_main:
                raise exceptions.ValidationError(
                    {"is_main": "A user should have one and only one main identity."}
                )
        super().clean()

    def save(self, *args, **kwargs):
        """Ensure users always have one and only one main identity."""
        super().save(*args, **kwargs)
        if self.is_main is True:
            self.user.identities.exclude(id=self.id).update(is_main=False)


class Team(BaseModel):
    """Team used for role based access control when matched with teams in OIDC tokens."""

    name = models.CharField(max_length=100)

    class Meta:
        db_table = "publish_team"
        ordering = ("name",)
        verbose_name = _("Team")
        verbose_name_plural = _("Teams")

    def __str__(self):
        return self.name


class Template(BaseModel):
    """HTML and CSS code used for formatting the print around the MarkDown body."""

    title = models.CharField(_("title"), max_length=255)
    description = models.TextField(_("description"), blank=True)
    code = models.TextField(_("code"), blank=True)
    css = models.TextField(_("css"), blank=True)
    is_public = models.BooleanField(
        _("public"),
        default=False,
        help_text=_("Whether this template is public for anyone to use."),
    )

    class Meta:
        db_table = "publish_template"
        ordering = ("title",)
        verbose_name = _("Template")
        verbose_name_plural = _("Templates")

    def __str__(self):
        return self.title

    # NOTE: the signature of the following method was lost in the diff rendering;
    # the name `get_body_html` and the `body` attribute it reads are reconstructed guesses.
    def get_body_html(self):
        """Return the markdown body rendered as HTML."""
        if not self.body:
            return ""
        return markdown.markdown(textwrap.dedent(self.body))

    def generate_document(self, body):
        """
        Generate and return a PDF document for this template around the
        markdown body passed as argument.
        """
        body_html = markdown.markdown(textwrap.dedent(body)) if body else ""
        document_html = HTML(
            string=DjangoTemplate(self.code).render(Context({"body": body_html}))
        )
        css = CSS(
            string=self.css,
            font_config=FontConfiguration(),
        )
        return document_html.write_pdf(stylesheets=[css], zoom=1)

    def get_abilities(self, user):
        """
        Compute and return abilities for a given user on the template.
        """
        is_owner_or_admin = False
        role = None

        if user.is_authenticated:
            try:
                role = self.user_role
            except AttributeError:
                try:
                    role = self.accesses.filter(user=user).values("role")[0]["role"]
                except (TemplateAccess.DoesNotExist, IndexError):
                    role = None

        is_owner_or_admin = role in [RoleChoices.OWNER, RoleChoices.ADMIN]

        return {
            "get": True,
            "patch": is_owner_or_admin,
            "put": is_owner_or_admin,
            "delete": role == RoleChoices.OWNER,
            "manage_accesses": is_owner_or_admin,
        }


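Reviewer note: the role-to-abilities mapping implemented by `Template.get_abilities` above can be distilled into a pure function, which makes the permission matrix easy to eyeball. This is an illustrative sketch only — `template_abilities` is a hypothetical name, and the role strings mirror `RoleChoices`:

```python
def template_abilities(role):
    """Map a user's role on a template (or None) to the ability dict."""
    is_owner_or_admin = role in ("owner", "administrator")
    return {
        "get": True,  # any authenticated user may read
        "patch": is_owner_or_admin,
        "put": is_owner_or_admin,
        "delete": role == "owner",  # only owners may delete
        "manage_accesses": is_owner_or_admin,
    }
```

For instance, `template_abilities("member")["patch"]` is `False` while `template_abilities("owner")["delete"]` is `True`.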
class TemplateAccess(BaseModel):
    """Link table between templates and users/teams."""

    template = models.ForeignKey(
        Template,
        on_delete=models.CASCADE,
        related_name="accesses",
    )
    user = models.ForeignKey(
        User,
        on_delete=models.CASCADE,
        related_name="accesses",
        null=True,
        blank=True,
    )
    team = models.ForeignKey(
        Team,
        on_delete=models.CASCADE,
        related_name="accesses",
        null=True,
        blank=True,
    )
    role = models.CharField(
        max_length=20, choices=RoleChoices.choices, default=RoleChoices.MEMBER
    )

    class Meta:
        db_table = "publish_template_access"
        verbose_name = _("Template/user relation")
        verbose_name_plural = _("Template/user relations")
        constraints = [
            models.UniqueConstraint(
                fields=["user", "template"],
                name="unique_template_user",
                violation_error_message=_("This user is already in this template."),
            ),
            models.UniqueConstraint(
                fields=["team", "template"],
                name="unique_template_team",
                violation_error_message=_("This team is already in this template."),
            ),
        ]

    def __str__(self):
        return f"{self.user!s} is {self.role:s} in template {self.template!s}"

    def get_abilities(self, user):
        """
        Compute and return abilities for a given user taking into account
        the current state of the object.
        """
        is_template_owner_or_admin = False
        role = None

        if user.is_authenticated:
            try:
                role = self.user_role
            except AttributeError:
                try:
                    role = self._meta.model.objects.filter(
                        template=self.template_id, user=user
                    ).values("role")[0]["role"]
                except (self._meta.model.DoesNotExist, IndexError):
                    role = None

        is_template_owner_or_admin = role in [RoleChoices.OWNER, RoleChoices.ADMIN]

        if self.role == RoleChoices.OWNER:
            can_delete = (
                user.id == self.user_id
                and self.template.accesses.filter(role=RoleChoices.OWNER).count() > 1
            )
            set_role_to = [RoleChoices.ADMIN, RoleChoices.MEMBER] if can_delete else []
        else:
            can_delete = is_template_owner_or_admin
            set_role_to = []
            if role == RoleChoices.OWNER:
                set_role_to.append(RoleChoices.OWNER)
            if is_template_owner_or_admin:
                set_role_to.extend([RoleChoices.ADMIN, RoleChoices.MEMBER])

        # Remove the current role as we don't want to propose it as an option
        try:
            set_role_to.remove(self.role)
        except ValueError:
            pass

        return {
            "delete": can_delete,
            "get": bool(role),
            "patch": bool(set_role_to),
            "put": bool(set_role_to),
            "set_role_to": set_role_to,
        }


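Reviewer note: the `set_role_to` branch in `TemplateAccess.get_abilities` above is the subtlest part of this commit. It can be restated as a pure function for review purposes — a sketch with hypothetical names (`assignable_roles`, the boolean parameters), where `access_role` is the role held by the access object, `request_role` the requesting user's role on the template, `is_own_access` whether the user is editing their own access, and `is_last_owner` whether no other owner remains:

```python
def assignable_roles(access_role, request_role, is_own_access, is_last_owner):
    """Return the roles a requesting user may assign to an access object."""
    is_owner_or_admin = request_role in ("owner", "administrator")
    if access_role == "owner":
        # Only an owner acting on their own access may step down,
        # and only if another owner remains on the template.
        can_change = is_own_access and not is_last_owner
        options = ["administrator", "member"] if can_change else []
    else:
        options = []
        if request_role == "owner":
            options.append("owner")  # only owners may promote to owner
        if is_owner_or_admin:
            options.extend(["administrator", "member"])
    # Don't propose the role the access already has.
    return [role for role in options if role != access_role]
```

For example, an administrator editing a member's access may only offer "administrator", while an owner may also offer "owner".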
def oidc_user_getter(validated_token):
    """
    Given a valid OIDC token, retrieve, create or update the corresponding user and identity in db.

    The token is expected to have the following fields in payload:
    - sub
    - email
    - ...
    """
    try:
        user_id = validated_token[api_settings.USER_ID_CLAIM]
    except KeyError as exc:
        raise InvalidToken(
            _("Token contained no recognizable user identification")
        ) from exc

    try:
        user = User.objects.get(identities__sub=user_id)
    except User.DoesNotExist:
        user = User.objects.create()
        Identity.objects.create(user=user, sub=user_id, email=validated_token["email"])

    return user
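Reviewer note: the claim handling in `oidc_user_getter` above can be sketched without the ORM — a hypothetical `extract_oidc_claims` helper, assuming the configured `USER_ID_CLAIM` is `"sub"`:

```python
USER_ID_CLAIM = "sub"  # assumption: mirrors api_settings.USER_ID_CLAIM

def extract_oidc_claims(payload):
    """Pull the user-id claim out of a token payload, or fail loudly."""
    try:
        sub = payload[USER_ID_CLAIM]
    except KeyError as exc:
        # The model code raises InvalidToken here; ValueError stands in for it.
        raise ValueError("Token contained no recognizable user identification") from exc
    return {"sub": sub, "email": payload.get("email")}
```

A payload missing `sub` is rejected before any user lookup or creation happens.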
14
src/backend/core/templates/core/generate_document.html
Normal file
@@ -0,0 +1,14 @@
<!DOCTYPE html>
<html>
<head>
    <title>Generate Document</title>
</head>
<body>
    <h2>Generate Document</h2>
    <form method="post" enctype="multipart/form-data">
        {% csrf_token %}
        {{ form.as_p }}
        <button type="submit">Generate PDF</button>
    </form>
</body>
</html>
0
src/backend/core/tests/__init__.py
Normal file
0
src/backend/core/tests/swagger/__init__.py
Normal file
41
src/backend/core/tests/swagger/test_openapi_schema.py
Normal file
@@ -0,0 +1,41 @@
"""
|
||||||
|
Test suite for generated openapi schema.
|
||||||
|
"""
|
||||||
|
import json
|
||||||
|
from io import StringIO
|
||||||
|
|
||||||
|
from django.core.management import call_command
|
||||||
|
from django.test import Client
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_openapi_client_schema():
|
||||||
|
"""
|
||||||
|
Generated and served OpenAPI client schema should be correct.
|
||||||
|
"""
|
||||||
|
# Start by generating the swagger.json file
|
||||||
|
output = StringIO()
|
||||||
|
call_command(
|
||||||
|
"spectacular",
|
||||||
|
"--api-version",
|
||||||
|
"v1.0",
|
||||||
|
"--urlconf",
|
||||||
|
"publish.api_urls",
|
||||||
|
"--format",
|
||||||
|
"openapi-json",
|
||||||
|
"--file",
|
||||||
|
"core/tests/swagger/swagger.json",
|
||||||
|
stdout=output,
|
||||||
|
)
|
||||||
|
assert output.getvalue() == ""
|
||||||
|
|
||||||
|
response = Client().get("/v1.0/swagger.json")
|
||||||
|
|
||||||
|
assert response.status_code == 200
|
||||||
|
with open(
|
||||||
|
"core/tests/swagger/swagger.json", "r", encoding="utf-8"
|
||||||
|
) as expected_schema:
|
||||||
|
assert response.json() == json.load(expected_schema)
|
||||||
50
src/backend/core/tests/teams/test_core_api_teams_create.py
Normal file
@@ -0,0 +1,50 @@
"""
|
||||||
|
Tests for Teams API endpoint in publish's core app: create
|
||||||
|
"""
|
||||||
|
import pytest
|
||||||
|
from rest_framework.test import APIClient
|
||||||
|
from rest_framework_simplejwt.tokens import AccessToken
|
||||||
|
|
||||||
|
from core.factories import IdentityFactory, TeamFactory, UserFactory
|
||||||
|
from core.models import Team
|
||||||
|
|
||||||
|
from ..utils import OIDCToken
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_create_anonymous():
|
||||||
|
"""Anonymous users should not be allowed to create teams."""
|
||||||
|
response = APIClient().post(
|
||||||
|
"/api/v1.0/teams/",
|
||||||
|
{
|
||||||
|
"name": "my team",
|
||||||
|
},
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert not Team.objects.exists()
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_create_authenticated():
|
||||||
|
"""
|
||||||
|
Authenticated users should be able to create teams and should automatically be declared
|
||||||
|
as the owner of the newly created team.
|
||||||
|
"""
|
||||||
|
identity = IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
response = APIClient().post(
|
||||||
|
"/api/v1.0/teams/",
|
||||||
|
{
|
||||||
|
"name": "my team",
|
||||||
|
},
|
||||||
|
format="json",
|
||||||
|
HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 201
|
||||||
|
team = Team.objects.get()
|
||||||
|
assert team.name == "my team"
|
||||||
|
assert team.accesses.filter(role="owner", user=user).exists()
|
||||||
107
src/backend/core/tests/teams/test_core_api_teams_delete.py
Normal file
@@ -0,0 +1,107 @@
"""
|
||||||
|
Tests for Teams API endpoint in publish's core app: delete
|
||||||
|
"""
|
||||||
|
import pytest
|
||||||
|
from rest_framework.test import APIClient
|
||||||
|
from rest_framework_simplejwt.tokens import AccessToken
|
||||||
|
|
||||||
|
from core import factories, models
|
||||||
|
|
||||||
|
from ..utils import OIDCToken
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_delete_anonymous():
|
||||||
|
"""Anonymous users should not be allowed to destroy a team."""
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/",
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert models.Team.objects.count() == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_delete_authenticated_unrelated():
|
||||||
|
"""
|
||||||
|
Authenticated users should not be allowed to delete a team to which they are not
|
||||||
|
related.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json() == {"detail": "Not found."}
|
||||||
|
assert models.Team.objects.count() == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_delete_authenticated_member():
|
||||||
|
"""
|
||||||
|
Authenticated users should not be allowed to delete a team for which they are
|
||||||
|
only a member.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory(users=[(user, "member")])
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 403
|
||||||
|
assert response.json() == {
|
||||||
|
"detail": "You do not have permission to perform this action."
|
||||||
|
}
|
||||||
|
assert models.Team.objects.count() == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_delete_authenticated_administrator():
|
||||||
|
"""
|
||||||
|
Authenticated users should not be allowed to delete a team for which they are
|
||||||
|
administrator.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory(users=[(user, "administrator")])
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 403
|
||||||
|
assert response.json() == {
|
||||||
|
"detail": "You do not have permission to perform this action."
|
||||||
|
}
|
||||||
|
assert models.Team.objects.count() == 1
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_delete_authenticated_owner():
|
||||||
|
"""
|
||||||
|
Authenticated users should be able to delete a team for which they are directly
|
||||||
|
owner.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory(users=[(user, "owner")])
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 204
|
||||||
|
assert models.Team.objects.exists() is False
|
||||||
118
src/backend/core/tests/teams/test_core_api_teams_list.py
Normal file
@@ -0,0 +1,118 @@
"""
|
||||||
|
Tests for Teams API endpoint in publish's core app: list
|
||||||
|
"""
|
||||||
|
from unittest import mock
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from rest_framework.pagination import PageNumberPagination
|
||||||
|
from rest_framework.status import HTTP_200_OK, HTTP_401_UNAUTHORIZED
|
||||||
|
from rest_framework.test import APIClient
|
||||||
|
|
||||||
|
from core import factories, models
|
||||||
|
from core.api import serializers
|
||||||
|
|
||||||
|
from ..utils import OIDCToken
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_list_anonymous():
|
||||||
|
"""Anonymous users should not be allowed to list teams."""
|
||||||
|
factories.TeamFactory.create_batch(2)
|
||||||
|
|
||||||
|
response = APIClient().get("/api/v1.0/teams/")
|
||||||
|
|
||||||
|
assert response.status_code == HTTP_401_UNAUTHORIZED
|
||||||
|
assert response.json() == {
|
||||||
|
"detail": "Authentication credentials were not provided."
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_list_authenticated():
|
||||||
|
"""Authenticated users should be able to list teams they are an owner/administrator/member of."""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
expected_ids = {
|
||||||
|
str(access.team.id)
|
||||||
|
for access in factories.TeamAccessFactory.create_batch(5, user=user)
|
||||||
|
}
|
||||||
|
factories.TeamFactory.create_batch(2) # Other teams
|
||||||
|
|
||||||
|
response = APIClient().get(
|
||||||
|
"/api/v1.0/teams/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HTTP_200_OK
|
||||||
|
results = response.json()["results"]
|
||||||
|
assert len(results) == 5
|
||||||
|
results_id = {result["id"] for result in results}
|
||||||
|
assert expected_ids == results_id
|
||||||
|
|
||||||
|
|
||||||
|
@mock.patch.object(PageNumberPagination, "get_page_size", return_value=2)
|
||||||
|
def test_api_teams_list_pagination(
|
||||||
|
_mock_page_size,
|
||||||
|
):
|
||||||
|
"""Pagination should work as expected."""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team_ids = [
|
||||||
|
str(access.team.id)
|
||||||
|
for access in factories.TeamAccessFactory.create_batch(3, user=user)
|
||||||
|
]
|
||||||
|
|
||||||
|
# Get page 1
|
||||||
|
response = APIClient().get(
|
||||||
|
"/api/v1.0/teams/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HTTP_200_OK
|
||||||
|
content = response.json()
|
||||||
|
|
||||||
|
assert content["count"] == 3
|
||||||
|
assert content["next"] == "http://testserver/api/v1.0/teams/?page=2"
|
||||||
|
assert content["previous"] is None
|
||||||
|
|
||||||
|
assert len(content["results"]) == 2
|
||||||
|
for item in content["results"]:
|
||||||
|
team_ids.remove(item["id"])
|
||||||
|
|
||||||
|
# Get page 2
|
||||||
|
response = APIClient().get(
|
||||||
|
"/api/v1.0/teams/?page=2", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HTTP_200_OK
|
||||||
|
content = response.json()
|
||||||
|
|
||||||
|
assert content["count"] == 3
|
||||||
|
assert content["next"] is None
|
||||||
|
assert content["previous"] == "http://testserver/api/v1.0/teams/"
|
||||||
|
|
||||||
|
assert len(content["results"]) == 1
|
||||||
|
team_ids.remove(content["results"][0]["id"])
|
||||||
|
assert team_ids == []
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_list_authenticated_distinct():
|
||||||
|
"""A team with several related users should only be listed once."""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
other_user = factories.UserFactory()
|
||||||
|
|
||||||
|
team = factories.TeamFactory(users=[user, other_user])
|
||||||
|
|
||||||
|
response = APIClient().get(
|
||||||
|
"/api/v1.0/teams/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HTTP_200_OK
|
||||||
|
content = response.json()
|
||||||
|
assert len(content["results"]) == 1
|
||||||
|
assert content["results"][0]["id"] == str(team.id)
|
||||||
86
src/backend/core/tests/teams/test_core_api_teams_retrieve.py
Normal file
@@ -0,0 +1,86 @@
"""
|
||||||
|
Tests for Teams API endpoint in publish's core app: retrieve
|
||||||
|
"""
|
||||||
|
import random
|
||||||
|
from collections import Counter
|
||||||
|
from unittest import mock
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from rest_framework.test import APIClient
|
||||||
|
|
||||||
|
from core import factories
|
||||||
|
|
||||||
|
from ..utils import OIDCToken
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_retrieve_anonymous():
|
||||||
|
"""Anonymous users should not be allowed to retrieve a team."""
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
response = APIClient().get(f"/api/v1.0/teams/{team.id}/")
|
||||||
|
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert response.json() == {
|
||||||
|
"detail": "Authentication credentials were not provided."
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_retrieve_authenticated_unrelated():
|
||||||
|
"""
|
||||||
|
Authenticated users should not be allowed to retrieve a team to which they are
|
||||||
|
not related.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
|
||||||
|
response = APIClient().get(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json() == {"detail": "Not found."}
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_retrieve_authenticated_related():
|
||||||
|
"""
|
||||||
|
Authenticated users should be allowed to retrieve a team to which they
|
||||||
|
are related whatever the role.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
access1 = factories.TeamAccessFactory(team=team, user=user)
|
||||||
|
access2 = factories.TeamAccessFactory(team=team)
|
||||||
|
|
||||||
|
response = APIClient().get(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
|
||||||
|
)
|
||||||
|
assert response.status_code == 200
|
||||||
|
content = response.json()
|
||||||
|
assert sorted(content.pop("accesses"), key=lambda x: x["user"]) == sorted(
|
||||||
|
[
|
||||||
|
{
|
||||||
|
"id": str(access1.id),
|
||||||
|
"user": str(user.id),
|
||||||
|
"role": access1.role,
|
||||||
|
"abilities": access1.get_abilities(user),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"id": str(access2.id),
|
||||||
|
"user": str(access2.user.id),
|
||||||
|
"role": access2.role,
|
||||||
|
"abilities": access2.get_abilities(user),
|
||||||
|
},
|
||||||
|
],
|
||||||
|
key=lambda x: x["user"],
|
||||||
|
)
|
||||||
|
assert response.json() == {
|
||||||
|
"id": str(team.id),
|
||||||
|
"name": team.name,
|
||||||
|
"abilities": team.get_abilities(user),
|
||||||
|
}
|
||||||
176
src/backend/core/tests/teams/test_core_api_teams_update.py
Normal file
@@ -0,0 +1,176 @@
"""
|
||||||
|
Tests for Teams API endpoint in publish's core app: update
|
||||||
|
"""
|
||||||
|
import random
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from rest_framework.test import APIClient
|
||||||
|
from rest_framework_simplejwt.tokens import AccessToken
|
||||||
|
|
||||||
|
from core import factories, models
|
||||||
|
from core.api import serializers
|
||||||
|
|
||||||
|
from ..utils import OIDCToken
|
||||||
|
|
||||||
|
pytestmark = pytest.mark.django_db
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_update_anonymous():
|
||||||
|
"""Anonymous users should not be allowed to update a team."""
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
old_team_values = serializers.TeamSerializer(instance=team).data
|
||||||
|
|
||||||
|
new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
|
||||||
|
response = APIClient().put(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/",
|
||||||
|
new_team_values,
|
||||||
|
format="json",
|
||||||
|
)
|
||||||
|
assert response.status_code == 401
|
||||||
|
assert response.json() == {
|
||||||
|
"detail": "Authentication credentials were not provided."
|
||||||
|
}
|
||||||
|
|
||||||
|
team.refresh_from_db()
|
||||||
|
team_values = serializers.TeamSerializer(instance=team).data
|
||||||
|
assert team_values == old_team_values
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_update_authenticated_unrelated():
|
||||||
|
"""
|
||||||
|
Authenticated users should not be allowed to update a team to which they are not related.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
old_team_values = serializers.TeamSerializer(instance=team).data
|
||||||
|
|
||||||
|
new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
|
||||||
|
response = APIClient().put(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/",
|
||||||
|
new_team_values,
|
||||||
|
format="json",
|
||||||
|
HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 404
|
||||||
|
assert response.json() == {"detail": "Not found."}
|
||||||
|
|
||||||
|
team.refresh_from_db()
|
||||||
|
team_values = serializers.TeamSerializer(instance=team).data
|
||||||
|
assert team_values == old_team_values
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_teams_update_authenticated_members():
|
||||||
|
"""
|
||||||
|
Users who are members of a team but not administrators should
|
||||||
|
not be allowed to update it.
|
||||||
|
"""
|
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "member")])
    old_team_values = serializers.TeamSerializer(instance=team).data

    new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
    response = APIClient().put(
        f"/api/v1.0/teams/{team.id!s}/",
        new_team_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    assert response.json() == {
        "detail": "You do not have permission to perform this action."
    }

    team.refresh_from_db()
    team_values = serializers.TeamSerializer(instance=team).data
    assert team_values == old_team_values


def test_api_teams_update_authenticated_administrators():
    """Administrators of a team should be allowed to update it."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    old_team_values = serializers.TeamSerializer(instance=team).data

    new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
    response = APIClient().put(
        f"/api/v1.0/teams/{team.id!s}/",
        new_team_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )
    assert response.status_code == 200

    team.refresh_from_db()
    team_values = serializers.TeamSerializer(instance=team).data
    for key, value in team_values.items():
        if key in ["id", "accesses"]:
            assert value == old_team_values[key]
        else:
            assert value == new_team_values[key]


def test_api_teams_update_authenticated_owners():
    """Owners of a team should be allowed to update it."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "owner")])
    old_team_values = serializers.TeamSerializer(instance=team).data

    new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
    response = APIClient().put(
        f"/api/v1.0/teams/{team.id!s}/",
        new_team_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )
    assert response.status_code == 200

    team.refresh_from_db()
    team_values = serializers.TeamSerializer(instance=team).data
    for key, value in team_values.items():
        if key in ["id", "accesses"]:
            assert value == old_team_values[key]
        else:
            assert value == new_team_values[key]


def test_api_teams_update_administrator_or_owner_of_another():
    """
    Being administrator or owner of a team should not grant authorization to update
    another team.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    factories.TeamFactory(users=[(user, random.choice(["administrator", "owner"]))])
    team = factories.TeamFactory(name="Old name")
    old_team_values = serializers.TeamSerializer(instance=team).data

    new_team_values = serializers.TeamSerializer(instance=factories.TeamFactory()).data
    response = APIClient().put(
        f"/api/v1.0/teams/{team.id!s}/",
        new_team_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 404
    assert response.json() == {"detail": "Not found."}

    team.refresh_from_db()
    team_values = serializers.TeamSerializer(instance=team).data
    assert team_values == old_team_values
843
src/backend/core/tests/test_api_team_accesses.py
Normal file
@@ -0,0 +1,843 @@
"""
Test team accesses API endpoints for users in publish's core app.
"""
import random
from uuid import uuid4

import pytest
from rest_framework.test import APIClient

from core import factories, models
from core.api import serializers

from .utils import OIDCToken

pytestmark = pytest.mark.django_db


def test_api_team_accesses_list_anonymous():
    """Anonymous users should not be allowed to list team accesses."""
    team = factories.TeamFactory()
    factories.TeamAccessFactory.create_batch(2, team=team)

    response = APIClient().get(f"/api/v1.0/teams/{team.id!s}/accesses/")
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_team_accesses_list_authenticated_unrelated():
    """
    Authenticated users should not be allowed to list team accesses for a team
    to which they are not related.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory()
    factories.TeamAccessFactory.create_batch(3, team=team)

    # Accesses for other teams to which the user is related should not be listed either
    other_access = factories.TeamAccessFactory(user=user)
    factories.TeamAccessFactory(team=other_access.team)

    response = APIClient().get(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )
    assert response.status_code == 200
    assert response.json() == {
        "count": 0,
        "next": None,
        "previous": None,
        "results": [],
    }


def test_api_team_accesses_list_authenticated_related():
    """
    Authenticated users should be able to list team accesses for a team
    to which they are related, whatever their role in the team.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory()
    user_access = models.TeamAccess.objects.create(team=team, user=user)  # random role
    access1, access2 = factories.TeamAccessFactory.create_batch(2, team=team)

    # Accesses for other teams to which the user is related should not be listed either
    other_access = factories.TeamAccessFactory(user=user)
    factories.TeamAccessFactory(team=other_access.team)

    response = APIClient().get(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 200
    content = response.json()
    assert len(content["results"]) == 3
    assert sorted(content["results"], key=lambda x: x["id"]) == sorted(
        [
            {
                "id": str(user_access.id),
                "user": str(user.id),
                "role": user_access.role,
                "abilities": user_access.get_abilities(user),
            },
            {
                "id": str(access1.id),
                "user": str(access1.user.id),
                "role": access1.role,
                "abilities": access1.get_abilities(user),
            },
            {
                "id": str(access2.id),
                "user": str(access2.user.id),
                "role": access2.role,
                "abilities": access2.get_abilities(user),
            },
        ],
        key=lambda x: x["id"],
    )


def test_api_team_accesses_retrieve_anonymous():
    """
    Anonymous users should not be allowed to retrieve a team access.
    """
    access = factories.TeamAccessFactory()

    response = APIClient().get(
        f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
    )

    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_team_accesses_retrieve_authenticated_unrelated():
    """
    Authenticated users should not be allowed to retrieve a team access for
    a team to which they are not related.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory()
    access = factories.TeamAccessFactory(team=team)

    response = APIClient().get(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )
    assert response.status_code == 403
    assert response.json() == {
        "detail": "You do not have permission to perform this action."
    }

    # Accesses related to another team should be excluded even if the user is related to it
    for access in [
        factories.TeamAccessFactory(),
        factories.TeamAccessFactory(user=user),
    ]:
        response = APIClient().get(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )

        assert response.status_code == 404
        assert response.json() == {"detail": "Not found."}


def test_api_team_accesses_retrieve_authenticated_related():
    """
    A user who is related to a team should be allowed to retrieve the
    associated team user accesses.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[user])
    access = factories.TeamAccessFactory(team=team)

    response = APIClient().get(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 200
    assert response.json() == {
        "id": str(access.id),
        "user": str(access.user.id),
        "role": access.role,
        "abilities": access.get_abilities(user),
    }


def test_api_team_accesses_create_anonymous():
    """Anonymous users should not be allowed to create team accesses."""
    user = factories.UserFactory()
    team = factories.TeamFactory()

    response = APIClient().post(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        {
            "user": str(user.id),
            "team": str(team.id),
            "role": random.choice(models.RoleChoices.choices)[0],
        },
        format="json",
    )
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }
    assert models.TeamAccess.objects.exists() is False


def test_api_team_accesses_create_authenticated_unrelated():
    """
    Authenticated users should not be allowed to create team accesses for a team to
    which they are not related.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    other_user = factories.UserFactory()
    team = factories.TeamFactory()

    response = APIClient().post(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        {
            "user": str(other_user.id),
        },
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    assert response.json() == {
        "detail": "You are not allowed to manage accesses for this team."
    }
    assert not models.TeamAccess.objects.filter(user=other_user).exists()


def test_api_team_accesses_create_authenticated_member():
    """Members of a team should not be allowed to create team accesses."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "member")])
    other_user = factories.UserFactory()

    api_client = APIClient()
    for role in [role[0] for role in models.RoleChoices.choices]:
        response = api_client.post(
            f"/api/v1.0/teams/{team.id!s}/accesses/",
            {
                "user": str(other_user.id),
                "role": role,
            },
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )

        assert response.status_code == 403
        assert response.json() == {
            "detail": "You are not allowed to manage accesses for this team."
        }

    assert not models.TeamAccess.objects.filter(user=other_user).exists()


def test_api_team_accesses_create_authenticated_administrator():
    """
    Administrators of a team should be able to create team accesses except for the "owner" role.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    other_user = factories.UserFactory()

    api_client = APIClient()

    # It should not be allowed to create an owner access
    response = api_client.post(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        {
            "user": str(other_user.id),
            "role": "owner",
        },
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    assert response.json() == {
        "detail": "Only owners of a team can assign other users as owners."
    }

    # It should be allowed to create a lower access
    role = random.choice(
        [role[0] for role in models.RoleChoices.choices if role[0] != "owner"]
    )

    response = api_client.post(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        {
            "user": str(other_user.id),
            "role": role,
        },
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 201
    assert models.TeamAccess.objects.filter(user=other_user).count() == 1
    new_team_access = models.TeamAccess.objects.filter(user=other_user).get()
    assert response.json() == {
        "abilities": new_team_access.get_abilities(user),
        "id": str(new_team_access.id),
        "role": role,
        "user": str(other_user.id),
    }


def test_api_team_accesses_create_authenticated_owner():
    """
    Owners of a team should be able to create team accesses whatever the role.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "owner")])
    other_user = factories.UserFactory()

    role = random.choice([role[0] for role in models.RoleChoices.choices])

    response = APIClient().post(
        f"/api/v1.0/teams/{team.id!s}/accesses/",
        {
            "user": str(other_user.id),
            "role": role,
        },
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 201
    assert models.TeamAccess.objects.filter(user=other_user).count() == 1
    new_team_access = models.TeamAccess.objects.filter(user=other_user).get()
    assert response.json() == {
        "abilities": new_team_access.get_abilities(user),
        "id": str(new_team_access.id),
        "role": role,
        "user": str(other_user.id),
    }


def test_api_team_accesses_update_anonymous():
    """Anonymous users should not be allowed to update a team access."""
    access = factories.TeamAccessFactory()
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        response = api_client.put(
            f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
            {**old_values, field: value},
            format="json",
        )
        assert response.status_code == 401

    access.refresh_from_db()
    updated_values = serializers.TeamAccessSerializer(instance=access).data
    assert updated_values == old_values


def test_api_team_accesses_update_authenticated_unrelated():
    """
    Authenticated users should not be allowed to update a team access for a team to which
    they are not related.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    access = factories.TeamAccessFactory()
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        response = api_client.put(
            f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
            {**old_values, field: value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 403

    access.refresh_from_db()
    updated_values = serializers.TeamAccessSerializer(instance=access).data
    assert updated_values == old_values


def test_api_team_accesses_update_authenticated_member():
    """Members of a team should not be allowed to update its accesses."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "member")])
    access = factories.TeamAccessFactory(team=team)
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        response = api_client.put(
            f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
            {**old_values, field: value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 403

    access.refresh_from_db()
    updated_values = serializers.TeamAccessSerializer(instance=access).data
    assert updated_values == old_values


def test_api_team_accesses_update_administrator_except_owner():
    """
    A user who is an administrator in a team should be allowed to update a user
    access for this team, as long as they don't try to set the role to owner.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    access = factories.TeamAccessFactory(
        team=team,
        role=random.choice(["administrator", "member"]),
    )
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user_id": factories.UserFactory().id,
        "role": random.choice(["administrator", "member"]),
    }

    api_client = APIClient()
    for field, value in new_values.items():
        new_data = {**old_values, field: value}
        response = api_client.put(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            data=new_data,
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )

        if (
            new_data["role"] == old_values["role"]
        ):  # we are not really updating the role
            assert response.status_code == 403
        else:
            assert response.status_code == 200

        access.refresh_from_db()
        updated_values = serializers.TeamAccessSerializer(instance=access).data
        if field == "role":
            assert updated_values == {**old_values, "role": new_values["role"]}
        else:
            assert updated_values == old_values


def test_api_team_accesses_update_administrator_from_owner():
    """
    A user who is an administrator in a team should not be allowed to update
    the user access of an "owner" for this team.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    other_user = factories.UserFactory()
    access = factories.TeamAccessFactory(team=team, user=other_user, role="owner")
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user_id": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        response = api_client.put(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            data={**old_values, field: value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 403
        access.refresh_from_db()
        updated_values = serializers.TeamAccessSerializer(instance=access).data
        assert updated_values == old_values


def test_api_team_accesses_update_administrator_to_owner():
    """
    A user who is an administrator in a team should not be allowed to update
    the user access of another user to grant team ownership.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    other_user = factories.UserFactory()
    access = factories.TeamAccessFactory(
        team=team,
        user=other_user,
        role=random.choice(["administrator", "member"]),
    )
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user_id": factories.UserFactory().id,
        "role": "owner",
    }

    api_client = APIClient()
    for field, value in new_values.items():
        new_data = {**old_values, field: value}
        response = api_client.put(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            data=new_data,
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        # We are not allowed or not really updating the role
        if field == "role" or new_data["role"] == old_values["role"]:
            assert response.status_code == 403
        else:
            assert response.status_code == 200

        access.refresh_from_db()
        updated_values = serializers.TeamAccessSerializer(instance=access).data
        assert updated_values == old_values


def test_api_team_accesses_update_owner_except_owner():
    """
    A user who is an owner in a team should be allowed to update
    a user access for this team except for existing "owner" accesses.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "owner")])
    factories.UserFactory()
    access = factories.TeamAccessFactory(
        team=team,
        role=random.choice(["administrator", "member"]),
    )
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user_id": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        new_data = {**old_values, field: value}
        response = api_client.put(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            data=new_data,
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )

        if (
            new_data["role"] == old_values["role"]
        ):  # we are not really updating the role
            assert response.status_code == 403
        else:
            assert response.status_code == 200

        access.refresh_from_db()
        updated_values = serializers.TeamAccessSerializer(instance=access).data

        if field == "role":
            assert updated_values == {**old_values, "role": new_values["role"]}
        else:
            assert updated_values == old_values


def test_api_team_accesses_update_owner_for_owners():
    """
    A user who is "owner" of a team should not be allowed to update
    an existing owner access for this team.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "owner")])
    access = factories.TeamAccessFactory(team=team, role="owner")
    old_values = serializers.TeamAccessSerializer(instance=access).data

    new_values = {
        "id": uuid4(),
        "user_id": factories.UserFactory().id,
        "role": random.choice(models.RoleChoices.choices)[0],
    }

    api_client = APIClient()
    for field, value in new_values.items():
        response = api_client.put(
            f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
            data={**old_values, field: value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 403
        access.refresh_from_db()
        updated_values = serializers.TeamAccessSerializer(instance=access).data
        assert updated_values == old_values


def test_api_team_accesses_update_owner_self():
    """
    A user who is owner of a team should be allowed to update
    their own user access provided there are other owners in the team.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory()
    access = factories.TeamAccessFactory(team=team, user=user, role="owner")
    old_values = serializers.TeamAccessSerializer(instance=access).data
    new_role = random.choice(["administrator", "member"])

    api_client = APIClient()
    response = api_client.put(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        data={**old_values, "role": new_role},
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    access.refresh_from_db()
    assert access.role == "owner"

    # Add another owner and it should now work
    factories.TeamAccessFactory(team=team, role="owner")

    response = api_client.put(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        data={**old_values, "role": new_role},
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 200
    access.refresh_from_db()
    assert access.role == new_role


# Delete


def test_api_team_accesses_delete_anonymous():
    """Anonymous users should not be allowed to destroy a team access."""
    access = factories.TeamAccessFactory()

    response = APIClient().delete(
        f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
    )

    assert response.status_code == 401
    assert models.TeamAccess.objects.count() == 1


def test_api_team_accesses_delete_authenticated():
    """
    Authenticated users should not be allowed to delete a team access for a
    team to which they are not related.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    access = factories.TeamAccessFactory()

    response = APIClient().delete(
        f"/api/v1.0/teams/{access.team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    assert models.TeamAccess.objects.count() == 1


def test_api_team_accesses_delete_member():
    """
    Authenticated users should not be allowed to delete a team access for a
    team in which they are a simple member.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "member")])
    access = factories.TeamAccessFactory(team=team)

    assert models.TeamAccess.objects.count() == 2
    assert models.TeamAccess.objects.filter(user=access.user).exists()

    response = APIClient().delete(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    assert models.TeamAccess.objects.count() == 2


def test_api_team_accesses_delete_administrators():
    """
    Users who are administrators in a team should be allowed to delete an access
    from the team provided it is not ownership.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "administrator")])
    access = factories.TeamAccessFactory(
        team=team, role=random.choice(["member", "administrator"])
    )

    assert models.TeamAccess.objects.count() == 2
    assert models.TeamAccess.objects.filter(user=access.user).exists()

    response = APIClient().delete(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 204
    assert models.TeamAccess.objects.count() == 1


def test_api_team_accesses_delete_owners_except_owners():
    """
    Users should be able to delete the team access of another user
    for a team of which they are owner provided it is not an owner access.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    team = factories.TeamFactory(users=[(user, "owner")])
    access = factories.TeamAccessFactory(
        team=team, role=random.choice(["member", "administrator"])
    )

    assert models.TeamAccess.objects.count() == 2
    assert models.TeamAccess.objects.filter(user=access.user).exists()

    response = APIClient().delete(
        f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 204
    assert models.TeamAccess.objects.count() == 1


def test_api_team_accesses_delete_owners_for_owners():
|
||||||
|
"""
|
||||||
|
Users should not be allowed to delete the team access of another owner
|
||||||
|
even for a team in which they are direct owner.
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory(users=[(user, "owner")])
|
||||||
|
access = factories.TeamAccessFactory(team=team, role="owner")
|
||||||
|
|
||||||
|
assert models.TeamAccess.objects.count() == 2
|
||||||
|
assert models.TeamAccess.objects.filter(user=access.user).exists()
|
||||||
|
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
|
||||||
|
HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 403
|
||||||
|
assert models.TeamAccess.objects.count() == 2
|
||||||
|
|
||||||
|
|
||||||
|
def test_api_team_accesses_delete_owners_last_owner():
|
||||||
|
"""
|
||||||
|
It should not be possible to delete the last owner access from a team
|
||||||
|
"""
|
||||||
|
identity = factories.IdentityFactory()
|
||||||
|
user = identity.user
|
||||||
|
jwt_token = OIDCToken.for_user(user)
|
||||||
|
|
||||||
|
team = factories.TeamFactory()
|
||||||
|
access = factories.TeamAccessFactory(team=team, user=user, role="owner")
|
||||||
|
|
||||||
|
assert models.TeamAccess.objects.count() == 1
|
||||||
|
response = APIClient().delete(
|
||||||
|
f"/api/v1.0/teams/{team.id!s}/accesses/{access.id!s}/",
|
||||||
|
HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == 403
|
||||||
|
assert models.TeamAccess.objects.count() == 1
|
||||||
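The delete tests above encode a role matrix that is easy to lose track of. Below is a framework-free sketch, not the project's actual implementation (the function name and parameters are illustrative), of the rule those tests assert: members may delete nothing, administrators may delete non-owner accesses, and an owner access can only be removed by the owner themselves, and never when it is the team's last one.

```python
def can_delete_access(requester_role, target_is_self, target_role, owner_count):
    """Hypothetical check mirroring the 403/204 outcomes asserted in the tests above."""
    if requester_role == "member":
        # Simple members may never delete accesses.
        return False
    if target_role == "owner":
        # Only an owner may remove an owner access, only their own,
        # and never the last owner access of the team.
        return requester_role == "owner" and target_is_self and owner_count > 1
    # Administrators and owners may delete member/administrator accesses.
    return requester_role in ("administrator", "owner")
```

In a DRF view this predicate would back the permission check that turns into a 403 response when it returns False.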
381 src/backend/core/tests/test_api_users.py Normal file
@@ -0,0 +1,381 @@
"""
Test users API endpoints in the publish core app.
"""
import pytest
from rest_framework.test import APIClient

from core import factories, models
from core.api import serializers

from .utils import OIDCToken

pytestmark = pytest.mark.django_db


def test_api_users_list_anonymous():
    """Anonymous users should not be allowed to list users."""
    factories.UserFactory()
    client = APIClient()
    response = client.get("/api/v1.0/users/")
    assert response.status_code == 404
    assert "Not Found" in response.content.decode("utf-8")


def test_api_users_list_authenticated():
    """
    Authenticated users should not be able to list users.
    """
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    factories.UserFactory.create_batch(2)
    response = APIClient().get(
        "/api/v1.0/users/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )
    assert response.status_code == 404
    assert "Not Found" in response.content.decode("utf-8")


def test_api_users_retrieve_me_anonymous():
    """Anonymous users should not be allowed to retrieve their own user."""
    factories.UserFactory.create_batch(2)
    client = APIClient()
    response = client.get("/api/v1.0/users/me/")
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_users_retrieve_me_authenticated():
    """Authenticated users should be able to retrieve their own user via the "/users/me" path."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    # Define profile contact
    contact = factories.ContactFactory(owner=user)
    user.profile_contact = contact
    user.save()

    factories.UserFactory.create_batch(2)
    response = APIClient().get(
        "/api/v1.0/users/me/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )

    assert response.status_code == 200
    assert response.json() == {
        "id": str(user.id),
        "language": user.language,
        "timezone": str(user.timezone),
        "is_device": False,
        "is_staff": False,
        "data": user.profile_contact.data,
    }


def test_api_users_retrieve_anonymous():
    """Anonymous users should not be allowed to retrieve a user."""
    client = APIClient()
    user = factories.UserFactory()
    response = client.get(f"/api/v1.0/users/{user.id!s}/")

    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_users_retrieve_authenticated_self():
    """
    Authenticated users should be allowed to retrieve their own user.
    The returned object should not contain the password.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    response = APIClient().get(
        f"/api/v1.0/users/{user.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )
    assert response.status_code == 405
    assert response.json() == {"detail": 'Method "GET" not allowed.'}


def test_api_users_retrieve_authenticated_other():
    """
    Authenticated users should be able to retrieve another user's detail view with
    limited information.
    """
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    other_user = factories.UserFactory()

    response = APIClient().get(
        f"/api/v1.0/users/{other_user.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )
    assert response.status_code == 405
    assert response.json() == {"detail": 'Method "GET" not allowed.'}


def test_api_users_create_anonymous():
    """Anonymous users should not be able to create users via the API."""
    response = APIClient().post(
        "/api/v1.0/users/",
        {
            "language": "fr-fr",
            "password": "mypassword",
        },
    )
    assert response.status_code == 404
    assert "Not Found" in response.content.decode("utf-8")
    assert models.User.objects.exists() is False


def test_api_users_create_authenticated():
    """Authenticated users should not be able to create users via the API."""
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    response = APIClient().post(
        "/api/v1.0/users/",
        {
            "language": "fr-fr",
            "password": "mypassword",
        },
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )
    assert response.status_code == 404
    assert "Not Found" in response.content.decode("utf-8")
    assert models.User.objects.exclude(id=user.id).exists() is False


def test_api_users_update_anonymous():
    """Anonymous users should not be able to update users via the API."""
    user = factories.UserFactory()

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = serializers.UserSerializer(instance=factories.UserFactory()).data

    response = APIClient().put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
    )

    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_update_authenticated_self():
    """
    Authenticated users should be able to update their own user but only "language"
    and "timezone" fields.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    response = APIClient().put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 200
    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        if key in ["language", "timezone"]:
            assert value == new_user_values[key]
        else:
            assert value == old_user_values[key]


def test_api_users_update_authenticated_other():
    """Authenticated users should not be allowed to update other users."""
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    user = factories.UserFactory()
    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = serializers.UserSerializer(instance=factories.UserFactory()).data

    response = APIClient().put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 403
    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_patch_anonymous():
    """Anonymous users should not be able to patch users via the API."""
    user = factories.UserFactory()

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = APIClient().patch(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
        )
        assert response.status_code == 401
        assert response.json() == {
            "detail": "Authentication credentials were not provided."
        }

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_patch_authenticated_self():
    """
    Authenticated users should be able to patch their own user but only "language"
    and "timezone" fields.
    """
    identity = factories.IdentityFactory()
    user = identity.user
    jwt_token = OIDCToken.for_user(user)

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = APIClient().patch(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 200

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        if key in ["language", "timezone"]:
            assert value == new_user_values[key]
        else:
            assert value == old_user_values[key]


def test_api_users_patch_authenticated_other():
    """Authenticated users should not be allowed to patch other users."""
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    user = factories.UserFactory()
    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = APIClient().put(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
            HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
        )
        assert response.status_code == 403

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_delete_list_anonymous():
    """Anonymous users should not be allowed to delete a list of users."""
    factories.UserFactory.create_batch(2)

    client = APIClient()
    response = client.delete("/api/v1.0/users/")

    assert response.status_code == 404
    assert models.User.objects.count() == 2


def test_api_users_delete_list_authenticated():
    """Authenticated users should not be allowed to delete a list of users."""
    factories.UserFactory.create_batch(2)
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    client = APIClient()
    response = client.delete(
        "/api/v1.0/users/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )

    assert response.status_code == 404
    assert models.User.objects.count() == 3


def test_api_users_delete_anonymous():
    """Anonymous users should not be allowed to delete a user."""
    user = factories.UserFactory()

    response = APIClient().delete(f"/api/v1.0/users/{user.id!s}/")

    assert response.status_code == 401
    assert models.User.objects.count() == 1


def test_api_users_delete_authenticated():
    """
    Authenticated users should not be allowed to delete a user other than themselves.
    """
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)
    other_user = factories.UserFactory()

    response = APIClient().delete(
        f"/api/v1.0/users/{other_user.id!s}/", HTTP_AUTHORIZATION=f"Bearer {jwt_token}"
    )

    assert response.status_code == 405
    assert models.User.objects.count() == 2


def test_api_users_delete_self():
    """Authenticated users should not be able to delete their own user."""
    identity = factories.IdentityFactory()
    jwt_token = OIDCToken.for_user(identity.user)

    response = APIClient().delete(
        f"/api/v1.0/users/{identity.user.id!s}/",
        HTTP_AUTHORIZATION=f"Bearer {jwt_token}",
    )

    assert response.status_code == 405
    assert models.User.objects.count() == 1
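The self-update and self-patch tests above assert that, whatever payload is submitted, only the "language" and "timezone" fields of a user actually change. A minimal framework-free sketch of that rule (the function and constant names are illustrative, not the project's serializer code):

```python
# Fields a user may change on their own account, per the tests above.
EDITABLE_FIELDS = {"language", "timezone"}

def apply_user_update(current, payload):
    """Return a copy of `current` with only the editable fields overwritten."""
    updated = dict(current)
    for key, value in payload.items():
        if key in EDITABLE_FIELDS:
            updated[key] = value
    return updated
```

In DRF terms this corresponds to marking every other serializer field read-only, so full PUT payloads succeed while silently ignoring the protected fields.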
183 src/backend/core/tests/test_models_identities.py Normal file
@@ -0,0 +1,183 @@
"""
Unit tests for the Identity model
"""
from django.core.exceptions import ValidationError

import pytest

from core import factories, models

pytestmark = pytest.mark.django_db


def test_models_identities_str_main():
    """The str representation should be the email address with indication that it is main."""
    identity = factories.IdentityFactory(email="david@example.com")
    assert str(identity) == "david@example.com[main]"


def test_models_identities_str_secondary():
    """The str representation of a secondary email should be the email address."""
    main_identity = factories.IdentityFactory()
    secondary_identity = factories.IdentityFactory(
        user=main_identity.user, email="david@example.com"
    )
    assert str(secondary_identity) == "david@example.com"


def test_models_identities_is_main_automatic():
    """The first identity created for a user should automatically be set as main."""
    user = factories.UserFactory()
    identity = models.Identity.objects.create(
        user=user, sub="123", email="david@example.com"
    )
    assert identity.is_main is True


def test_models_identities_is_main_exists():
    """A user should always keep one and only one of its identities as main."""
    user = factories.UserFactory()
    main_identity, _secondary_identity = factories.IdentityFactory.create_batch(
        2, user=user
    )

    assert main_identity.is_main is True

    main_identity.is_main = False
    with pytest.raises(
        ValidationError, match="A user should have one and only one main identity."
    ):
        main_identity.save()


def test_models_identities_is_main_switch():
    """Setting a secondary identity as main should reset the existing main identity."""
    user = factories.UserFactory()
    first_identity, second_identity = factories.IdentityFactory.create_batch(
        2, user=user
    )

    assert first_identity.is_main is True

    second_identity.is_main = True
    second_identity.save()

    second_identity.refresh_from_db()
    assert second_identity.is_main is True

    first_identity.refresh_from_db()
    assert first_identity.is_main is False


def test_models_identities_email_required():
    """The "email" field is required."""
    user = factories.UserFactory()
    with pytest.raises(ValidationError, match="This field cannot be null."):
        models.Identity.objects.create(user=user, email=None)


def test_models_identities_user_required():
    """The "user" field is required."""
    with pytest.raises(models.User.DoesNotExist, match="Identity has no user."):
        models.Identity.objects.create(user=None, email="david@example.com")


def test_models_identities_email_unique_same_user():
    """The "email" field should be unique for a given user."""
    email = factories.IdentityFactory()

    with pytest.raises(
        ValidationError,
        match="Identity with this User and Email address already exists.",
    ):
        factories.IdentityFactory(user=email.user, email=email.email)


def test_models_identities_email_unique_different_users():
    """The "email" field should not be unique among users."""
    email = factories.IdentityFactory()
    factories.IdentityFactory(email=email.email)


def test_models_identities_email_normalization():
    """The email field should be automatically normalized upon saving."""
    email = factories.IdentityFactory()
    email.email = "Thomas.Jefferson@Example.com"
    email.save()
    assert email.email == "Thomas.Jefferson@example.com"


def test_models_identities_ordering():
    """Identities should be returned ordered by main status then by their email address."""
    user = factories.UserFactory()
    factories.IdentityFactory.create_batch(5, user=user)

    emails = models.Identity.objects.all()

    assert emails[0].is_main is True
    for i in range(3):
        assert emails[i + 1].is_main is False
        assert emails[i + 2].email >= emails[i + 1].email


def test_models_identities_sub_null():
    """The "sub" field should not be null."""
    user = factories.UserFactory()
    with pytest.raises(ValidationError, match="This field cannot be null."):
        models.Identity.objects.create(user=user, sub=None)


def test_models_identities_sub_blank():
    """The "sub" field should not be blank."""
    user = factories.UserFactory()
    with pytest.raises(ValidationError, match="This field cannot be blank."):
        models.Identity.objects.create(user=user, email="david@example.com", sub="")


def test_models_identities_sub_unique():
    """The "sub" field should be unique."""
    user = factories.UserFactory()
    identity = factories.IdentityFactory()
    with pytest.raises(ValidationError, match="Identity with this Sub already exists."):
        models.Identity.objects.create(user=user, sub=identity.sub)


def test_models_identities_sub_max_length():
    """The sub field should be 255 characters maximum."""
    factories.IdentityFactory(sub="a" * 255)
    with pytest.raises(ValidationError) as excinfo:
        factories.IdentityFactory(sub="a" * 256)

    assert (
        str(excinfo.value)
        == "{'sub': ['Ensure this value has at most 255 characters (it has 256).']}"
    )


def test_models_identities_sub_special_characters():
    """The sub field should accept periods, dashes, +, @ and underscores."""
    identity = factories.IdentityFactory(sub="dave.bowman-1+2@hal_9000")
    assert identity.sub == "dave.bowman-1+2@hal_9000"


def test_models_identities_sub_spaces():
    """The sub field should not accept spaces."""
    with pytest.raises(ValidationError) as excinfo:
        factories.IdentityFactory(sub="a b")

    assert str(excinfo.value) == (
        "{'sub': ['Enter a valid sub. This value may contain only letters, numbers, "
        "and @/./+/-/_ characters.']}"
    )


def test_models_identities_sub_upper_case():
    """The sub field should accept upper case characters."""
    identity = factories.IdentityFactory(sub="John")
    assert identity.sub == "John"


def test_models_identities_sub_ascii():
    """The sub field should accept non-ASCII letters."""
    identity = factories.IdentityFactory(sub="rené")
    assert identity.sub == "rené"
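Two behaviours in the Identity tests above are worth spelling out: email normalization lowercases only the domain part (the local part "Thomas.Jefferson" is preserved as-is), and switching the main identity demotes the previous one so exactly one identity stays main. A framework-free sketch of both, assuming hypothetical helper names rather than the project's actual model code:

```python
def normalize_email(email):
    """Lowercase only the domain part of an email, keeping the local part intact."""
    local, _, domain = email.partition("@")
    return f"{local}@{domain.lower()}"

def set_main(identities, new_main_index):
    """Mark one identity dict as main and demote every other one in the list."""
    for i, identity in enumerate(identities):
        identity["is_main"] = i == new_main_index
    return identities
```

In the real model these invariants would live in `save()` so they hold for every write path, which is what the `is_main_exists` and `is_main_switch` tests exercise.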
264 src/backend/core/tests/test_models_team_accesses.py Normal file
@@ -0,0 +1,264 @@
"""
Unit tests for the TeamAccess model
"""
from django.contrib.auth.models import AnonymousUser
from django.core.exceptions import ValidationError

import pytest

from core import factories

pytestmark = pytest.mark.django_db


def test_models_team_accesses_str():
    """
    The str representation should include user name, team full name and role.
    """
    contact = factories.ContactFactory(full_name="David Bowman")
    user = contact.owner
    user.profile_contact = contact
    user.save()
    access = factories.TeamAccessFactory(
        role="member",
        user=user,
        team__name="admins",
    )
    assert str(access) == "David Bowman is member in team admins"


def test_models_team_accesses_unique():
    """Team accesses should be unique for a given couple of user and team."""
    access = factories.TeamAccessFactory()

    with pytest.raises(
        ValidationError,
        match="Team/user relation with this User and Team already exists.",
    ):
        factories.TeamAccessFactory(user=access.user, team=access.team)


# get_abilities


def test_models_team_access_get_abilities_anonymous():
    """Check abilities returned for an anonymous user."""
    access = factories.TeamAccessFactory()
    abilities = access.get_abilities(AnonymousUser())
    assert abilities == {
        "delete": False,
        "get": False,
        "patch": False,
        "put": False,
        "set_role_to": [],
    }


def test_models_team_access_get_abilities_authenticated():
    """Check abilities returned for an authenticated user."""
    access = factories.TeamAccessFactory()
    user = factories.UserFactory()
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": False,
        "get": False,
        "patch": False,
        "put": False,
        "set_role_to": [],
    }


# - for owner


def test_models_team_access_get_abilities_for_owner_of_self_allowed():
    """
    Check abilities of self access for the owner of a team when there is more than one owner left.
    """
    access = factories.TeamAccessFactory(role="owner")
    factories.TeamAccessFactory(team=access.team, role="owner")
    abilities = access.get_abilities(access.user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "set_role_to": ["administrator", "member"],
    }


def test_models_team_access_get_abilities_for_owner_of_self_last():
    """Check abilities of self access for the owner of a team when there is only one owner left."""
    access = factories.TeamAccessFactory(role="owner")
    abilities = access.get_abilities(access.user)
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "set_role_to": [],
    }


def test_models_team_access_get_abilities_for_owner_of_owner():
    """Check abilities of owner access for the owner of a team."""
    access = factories.TeamAccessFactory(role="owner")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="owner").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "set_role_to": [],
    }


def test_models_team_access_get_abilities_for_owner_of_administrator():
    """Check abilities of administrator access for the owner of a team."""
    access = factories.TeamAccessFactory(role="administrator")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="owner").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "set_role_to": ["owner", "member"],
    }


def test_models_team_access_get_abilities_for_owner_of_member():
    """Check abilities of member access for the owner of a team."""
    access = factories.TeamAccessFactory(role="member")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="owner").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "set_role_to": ["owner", "administrator"],
    }


# - for administrator


def test_models_team_access_get_abilities_for_administrator_of_owner():
    """Check abilities of owner access for the administrator of a team."""
    access = factories.TeamAccessFactory(role="owner")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="administrator").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "set_role_to": [],
    }


def test_models_team_access_get_abilities_for_administrator_of_administrator():
    """Check abilities of administrator access for the administrator of a team."""
    access = factories.TeamAccessFactory(role="administrator")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="administrator").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "set_role_to": ["member"],
    }


def test_models_team_access_get_abilities_for_administrator_of_member():
    """Check abilities of member access for the administrator of a team."""
    access = factories.TeamAccessFactory(role="member")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="administrator").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "set_role_to": ["administrator"],
    }


# - for member


def test_models_team_access_get_abilities_for_member_of_owner():
    """Check abilities of owner access for the member of a team."""
    access = factories.TeamAccessFactory(role="owner")
    factories.TeamAccessFactory(team=access.team)  # another one
    user = factories.TeamAccessFactory(team=access.team, role="member").user
    abilities = access.get_abilities(user)
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
|
"put": False,
|
||||||
|
"set_role_to": [],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_models_team_access_get_abilities_for_member_of_administrator():
|
||||||
|
"""Check abilities of administrator access for the member of a team."""
|
||||||
|
access = factories.TeamAccessFactory(role="administrator")
|
||||||
|
factories.TeamAccessFactory(team=access.team) # another one
|
||||||
|
user = factories.TeamAccessFactory(team=access.team, role="member").user
|
||||||
|
abilities = access.get_abilities(user)
|
||||||
|
assert abilities == {
|
||||||
|
"delete": False,
|
||||||
|
"get": True,
|
||||||
|
"patch": False,
|
||||||
|
"put": False,
|
||||||
|
"set_role_to": [],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_models_team_access_get_abilities_for_member_of_member_user(
|
||||||
|
django_assert_num_queries
|
||||||
|
):
|
||||||
|
"""Check abilities of member access for the member of a team."""
|
||||||
|
access = factories.TeamAccessFactory(role="member")
|
||||||
|
factories.TeamAccessFactory(team=access.team) # another one
|
||||||
|
user = factories.TeamAccessFactory(team=access.team, role="member").user
|
||||||
|
|
||||||
|
with django_assert_num_queries(1):
|
||||||
|
abilities = access.get_abilities(user)
|
||||||
|
|
||||||
|
assert abilities == {
|
||||||
|
"delete": False,
|
||||||
|
"get": True,
|
||||||
|
"patch": False,
|
||||||
|
"put": False,
|
||||||
|
"set_role_to": [],
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def test_models_team_access_get_abilities_preset_role(django_assert_num_queries):
|
||||||
|
"""No query is done if the role is preset, e.g., with a query annotation."""
|
||||||
|
access = factories.TeamAccessFactory(role="member")
|
||||||
|
user = factories.TeamAccessFactory(team=access.team, role="member").user
|
||||||
|
access.user_role = "member"
|
||||||
|
|
||||||
|
with django_assert_num_queries(0):
|
||||||
|
abilities = access.get_abilities(user)
|
||||||
|
|
||||||
|
assert abilities == {
|
||||||
|
"delete": False,
|
||||||
|
"get": True,
|
||||||
|
"patch": False,
|
||||||
|
"put": False,
|
||||||
|
"set_role_to": [],
|
||||||
|
}
|
||||||
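Read together, the assertions above encode a simple role hierarchy on team accesses: owners and administrators can delete/patch/put accesses of administrators and members, nobody shown here can manage an owner's access, and members can only read. A standalone sketch of that rule (`can_manage` is a hypothetical helper for illustration, not part of the codebase):

```python
def can_manage(manager_role, target_role):
    """Sketch of the delete/patch/put rule the tests above imply.

    Owners and administrators manage administrator and member accesses;
    owner accesses are untouchable here; members manage nothing.
    """
    if manager_role in ("owner", "administrator"):
        return target_role in ("administrator", "member")
    return False
```

This only covers the boolean abilities shown; `set_role_to` additionally depends on which roles outrank the caller.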
135 src/backend/core/tests/test_models_teams.py Normal file
@@ -0,0 +1,135 @@
"""
Unit tests for the Team model
"""
from django.contrib.auth.models import AnonymousUser
from django.core.exceptions import ValidationError

import pytest

from core import factories, models

pytestmark = pytest.mark.django_db


def test_models_teams_str():
    """The str representation should be the name of the team."""
    team = factories.TeamFactory(name="admins")
    assert str(team) == "admins"


def test_models_teams_id_unique():
    """The "id" field should be unique."""
    team = factories.TeamFactory()
    with pytest.raises(ValidationError, match="Team with this Id already exists."):
        factories.TeamFactory(id=team.id)


def test_models_teams_name_null():
    """The "name" field should not be null."""
    with pytest.raises(ValidationError, match="This field cannot be null."):
        models.Team.objects.create(name=None)


def test_models_teams_name_empty():
    """The "name" field should not be empty."""
    with pytest.raises(ValidationError, match="This field cannot be blank."):
        models.Team.objects.create(name="")


def test_models_teams_name_max_length():
    """The "name" field should be 100 characters maximum."""
    factories.TeamFactory(name="a " * 50)
    with pytest.raises(
        ValidationError,
        match=r"Ensure this value has at most 100 characters \(it has 102\)\.",
    ):
        factories.TeamFactory(name="a " * 51)


# get_abilities


def test_models_teams_get_abilities_anonymous():
    """Check abilities returned for an anonymous user."""
    team = factories.TeamFactory()
    abilities = team.get_abilities(AnonymousUser())
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "manage_accesses": False,
    }


def test_models_teams_get_abilities_authenticated():
    """Check abilities returned for an authenticated user."""
    team = factories.TeamFactory()
    abilities = team.get_abilities(factories.UserFactory())
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "manage_accesses": False,
    }


def test_models_teams_get_abilities_owner():
    """Check abilities returned for the owner of a team."""
    user = factories.UserFactory()
    access = factories.TeamAccessFactory(role="owner", user=user)
    abilities = access.team.get_abilities(access.user)
    assert abilities == {
        "delete": True,
        "get": True,
        "patch": True,
        "put": True,
        "manage_accesses": True,
    }


def test_models_teams_get_abilities_administrator():
    """Check abilities returned for the administrator of a team."""
    access = factories.TeamAccessFactory(role="administrator")
    abilities = access.team.get_abilities(access.user)
    assert abilities == {
        "delete": False,
        "get": True,
        "patch": True,
        "put": True,
        "manage_accesses": True,
    }


def test_models_teams_get_abilities_member_user(django_assert_num_queries):
    """Check abilities returned for the member of a team."""
    access = factories.TeamAccessFactory(role="member")

    with django_assert_num_queries(1):
        abilities = access.team.get_abilities(access.user)

    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "manage_accesses": False,
    }


def test_models_teams_get_abilities_preset_role(django_assert_num_queries):
    """No query is done if the role is preset, e.g., with a query annotation."""
    access = factories.TeamAccessFactory(role="member")
    access.team.user_role = "member"

    with django_assert_num_queries(0):
        abilities = access.team.get_abilities(access.user)

    assert abilities == {
        "delete": False,
        "get": True,
        "patch": False,
        "put": False,
        "manage_accesses": False,
    }
83 src/backend/core/tests/test_models_users.py Normal file
@@ -0,0 +1,83 @@
"""
Unit tests for the User model
"""
from unittest import mock

from django.core.exceptions import ValidationError

import pytest

from core import factories, models

pytestmark = pytest.mark.django_db


def test_models_users_str():
    """The str representation should be the full name."""
    user = factories.UserFactory()
    contact = factories.ContactFactory(full_name="david bowman", owner=user)
    user.profile_contact = contact
    user.save()

    assert str(user) == "david bowman"


def test_models_users_id_unique():
    """The "id" field should be unique."""
    user = factories.UserFactory()
    with pytest.raises(ValidationError, match="User with this Id already exists."):
        factories.UserFactory(id=user.id)


def test_models_users_profile_not_owned():
    """A user cannot declare as profile a contact that is not owned."""
    user = factories.UserFactory()
    contact = factories.ContactFactory(base=None, owner=None)

    user.profile_contact = contact
    with pytest.raises(ValidationError) as excinfo:
        user.save()

    assert (
        str(excinfo.value)
        == "{'__all__': ['Users can only declare as profile a contact they own.']}"
    )


def test_models_users_profile_owned_by_other():
    """A user cannot declare as profile a contact that is owned by another user."""
    user = factories.UserFactory()
    contact = factories.ContactFactory()

    user.profile_contact = contact
    with pytest.raises(ValidationError) as excinfo:
        user.save()

    assert (
        str(excinfo.value)
        == "{'__all__': ['Users can only declare as profile a contact they own.']}"
    )


def test_models_users_send_mail_main_existing():
    """The "email_user" method should send mail to the user's main email address."""
    main_email = factories.IdentityFactory(email="dave@example.com")
    user = main_email.user
    factories.IdentityFactory.create_batch(2, user=user)

    with mock.patch("django.core.mail.send_mail") as mock_send:
        user.email_user("my subject", "my message")

    mock_send.assert_called_once_with(
        "my subject", "my message", None, ["dave@example.com"]
    )


def test_models_users_send_mail_main_missing():
    """The "email_user" method should fail if the user has no email address."""
    user = factories.UserFactory()

    with pytest.raises(models.Identity.DoesNotExist) as excinfo:
        user.email_user("my subject", "my message")

    assert str(excinfo.value) == "Identity matching query does not exist."
21 src/backend/core/tests/utils.py Normal file
@@ -0,0 +1,21 @@
"""Utils for tests in the publish core application"""
from rest_framework_simplejwt.tokens import AccessToken


class OIDCToken(AccessToken):
    """Set payload on token from user/contact/email"""

    @classmethod
    def for_user(cls, user):
        token = super().for_user(user)
        identity = user.identities.filter(is_main=True).first()
        token["first_name"] = (
            user.profile_contact.short_name if user.profile_contact else "David"
        )
        token["last_name"] = (
            " ".join(user.profile_contact.full_name.split()[1:])
            if user.profile_contact
            else "Bowman"
        )
        token["email"] = identity.email
        return token
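The last-name derivation in `OIDCToken.for_user` above relies on a simple convention: the first whitespace-separated token of the contact's full name is the first name and the remainder is the last name, with "David Bowman" as the fallback. A standalone sketch of that convention (`split_full_name` is a hypothetical helper for illustration only):

```python
def split_full_name(full_name, default=("David", "Bowman")):
    """Split a full name into (first_name, last_name) the way the test token does."""
    if not full_name:
        return default
    parts = full_name.split()
    # First token is the first name; everything else joins into the last name.
    return parts[0], " ".join(parts[1:])
```

Note that a single-token name yields an empty last name under this convention.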
8 src/backend/core/urls.py Normal file
@@ -0,0 +1,8 @@
"""URL configuration for the core app."""
from django.urls import path

from core.views import generate_document

urlpatterns = [
    path('generate-document/', generate_document, name='generate_document'),
]
27 src/backend/core/views.py Normal file
@@ -0,0 +1,27 @@
from django.http import HttpResponse
from django.shortcuts import render

from .forms import DocumentGenerationForm


def generate_document(request):
    if request.method == 'POST':
        form = DocumentGenerationForm(request.POST)
        if form.is_valid():
            # Get the selected template from the form
            template = form.cleaned_data['template']

            # Get the body content from the form
            body = form.cleaned_data['body']

            # Call the generate_document method
            pdf_content = template.generate_document(body)

            # Return the generated PDF as a response for download
            response = HttpResponse(pdf_content, content_type='application/pdf')
            response['Content-Disposition'] = f'attachment; filename="{template.title}.pdf"'
            return response
    else:
        form = DocumentGenerationForm()

    return render(request, 'core/generate_document.html', {'form': form})
0 src/backend/demo/__init__.py Normal file
0 src/backend/demo/management/__init__.py Normal file
0 src/backend/demo/management/commands/__init__.py Normal file
45 src/backend/demo/management/commands/createsuperuser.py Normal file
@@ -0,0 +1,45 @@
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = 'Create a superuser with an email and a password'

    def add_arguments(self, parser):
        """Define required arguments "email" and "password"."""
        parser.add_argument(
            "--email",
            help="Email for the user.",
        )
        parser.add_argument(
            "--password",
            help='Password for the user.',
        )

    def handle(self, *args, **options):
        """
        Given an email and a password, create a superuser or upgrade the existing
        user to superuser status.
        """
        UserModel = get_user_model()
        email = options.get('email')
        try:
            user = UserModel.objects.get(email=email)
        except UserModel.DoesNotExist:
            user = UserModel(email=email)
            message = 'Superuser created successfully.'
        else:
            if user.is_superuser and user.is_staff:
                message = "Superuser already exists."
            else:
                message = "User already existed and was upgraded to superuser."

        user.is_superuser = True
        user.is_staff = True
        user.set_password(options['password'])
        user.save()

        self.stdout.write(self.style.SUCCESS(message))
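The `handle` method above implements a create-or-upgrade (upsert) flow: fetch by email, create the user if missing, otherwise pick a status message, then unconditionally promote and re-set the password. A minimal sketch of that control flow against an in-memory dict instead of the ORM (`upsert_superuser` is hypothetical, for illustration only):

```python
def upsert_superuser(users, email, password):
    """Create the user if missing, then promote it to superuser (in-memory sketch)."""
    user = users.get(email)
    if user is None:
        user = {"email": email, "is_superuser": False, "is_staff": False}
        users[email] = user
        message = "Superuser created successfully."
    elif user["is_superuser"] and user["is_staff"]:
        message = "Superuser already exists."
    else:
        message = "User already existed and was upgraded to superuser."
    # Promotion and password reset happen on every code path, as in the command.
    user.update(is_superuser=True, is_staff=True, password=password)
    return message
```

Running it twice with the same email exercises both the "created" and "already exists" branches, which makes the command idempotent.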
23 src/backend/demo/utils.py Normal file
@@ -0,0 +1,23 @@
from django.contrib.auth.management.commands.createsuperuser import Command as BaseCommand
from django.core.exceptions import ValidationError


class Command(BaseCommand):
    help = 'Create a superuser without a username field'

    def handle(self, *args, **options):
        # Check if a superuser already exists
        try:
            self.UserModel._default_manager.db_manager(options['database']).get(
                is_superuser=True,
            )
        except self.UserModel.DoesNotExist:
            # If not, create a superuser without a username
            email = options.get('email')
            password = options.get('password')
            self.UserModel._default_manager.db_manager(options['database']).create_superuser(
                email=email,
                password=password,
            )
            self.stdout.write(self.style.SUCCESS('Superuser created successfully.'))
        except ValidationError as e:
            self.stderr.write(self.style.ERROR(f'Error creating superuser: {", ".join(e.messages)}'))
14 src/backend/manage.py Normal file
@@ -0,0 +1,14 @@
#!/usr/bin/env python
"""
publish's sandbox management script.
"""
import os
import sys

if __name__ == "__main__":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "publish.settings")
    os.environ.setdefault("DJANGO_CONFIGURATION", "Development")

    from configurations.management import execute_from_command_line

    execute_from_command_line(sys.argv)
0 src/backend/publish/__init__.py Normal file
36 src/backend/publish/api_urls.py Normal file
@@ -0,0 +1,36 @@
"""API URL Configuration"""
from django.conf import settings
from django.urls import include, path, re_path

from rest_framework.routers import DefaultRouter

from core.api import viewsets

# - Main endpoints
router = DefaultRouter()
router.register("contacts", viewsets.ContactViewSet, basename="contacts")
router.register("teams", viewsets.TeamViewSet, basename="teams")
router.register("users", viewsets.UserViewSet, basename="users")

# - Routes nested under a team
team_related_router = DefaultRouter()
team_related_router.register(
    "accesses",
    viewsets.TeamAccessViewSet,
    basename="team_accesses",
)

urlpatterns = [
    path(
        f"api/{settings.API_VERSION}/",
        include(
            [
                *router.urls,
                re_path(
                    r"^teams/(?P<team_id>[0-9a-z-]*)/",
                    include(team_related_router.urls),
                ),
            ]
        ),
    )
]
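The nested route above captures the team id with a loose character class (`[0-9a-z-]*`) rather than a strict UUID pattern, so any lowercase hex-and-hyphen segment is accepted. A quick standalone check of what that regex matches (`extract_team_id` is a hypothetical helper, not part of the codebase):

```python
import re

# Same pattern as the nested team route registered in api_urls.py
TEAM_ROUTE = re.compile(r"^teams/(?P<team_id>[0-9a-z-]*)/")

def extract_team_id(path):
    """Return the team id captured from a request path, or None if no match."""
    match = TEAM_ROUTE.match(path)
    return match.group("team_id") if match else None
```

Because the quantifier is `*`, an empty id like `teams//accesses/` also matches; a stricter deployment might prefer a UUID pattern or `+`.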
22 src/backend/publish/celery_app.py Normal file
@@ -0,0 +1,22 @@
"""publish celery configuration file."""
import os

from celery import Celery
from configurations.importer import install

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "publish.settings")
os.environ.setdefault("DJANGO_CONFIGURATION", "Development")

install(check_options=True)

app = Celery("publish")

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django apps.
app.autodiscover_tasks()
510
src/backend/publish/settings.py
Executable file
510
src/backend/publish/settings.py
Executable file
@@ -0,0 +1,510 @@
|
|||||||
|
"""
|
||||||
|
Django settings for publish project.
|
||||||
|
|
||||||
|
Generated by 'django-admin startproject' using Django 3.1.5.
|
||||||
|
|
||||||
|
For more information on this file, see
|
||||||
|
https://docs.djangoproject.com/en/3.1/topics/settings/
|
||||||
|
|
||||||
|
For the full list of settings and their values, see
|
||||||
|
https://docs.djangoproject.com/en/3.1/ref/settings/
|
||||||
|
"""
|
||||||
|
import json
|
||||||
|
import os
|
||||||
|
|
||||||
|
from django.utils.translation import gettext_lazy as _
|
||||||
|
|
||||||
|
import sentry_sdk
|
||||||
|
from configurations import Configuration, values
|
||||||
|
from sentry_sdk.integrations.django import DjangoIntegration
|
||||||
|
|
||||||
|
# Build paths inside the project like this: BASE_DIR / 'subdir'.
|
||||||
|
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
|
||||||
|
DATA_DIR = os.path.join("/", "data")
|
||||||
|
|
||||||
|
|
||||||
|
def get_release():
|
||||||
|
"""
|
||||||
|
Get the current release of the application
|
||||||
|
|
||||||
|
By release, we mean the release from the version.json file à la Mozilla [1]
|
||||||
|
(if any). If this file has not been found, it defaults to "NA".
|
||||||
|
|
||||||
|
[1]
|
||||||
|
https://github.com/mozilla-services/Dockerflow/blob/master/docs/version_object.md
|
||||||
|
"""
|
||||||
|
# Try to get the current release from the version.json file generated by the
|
||||||
|
# CI during the Docker image build
|
||||||
|
try:
|
||||||
|
with open(os.path.join(BASE_DIR, "version.json"), encoding="utf8") as version:
|
||||||
|
return json.load(version)["version"]
|
||||||
|
except FileNotFoundError:
|
||||||
|
return "NA" # Default: not available
|
||||||
|
|
||||||
|
|
||||||
|
class Base(Configuration):
|
||||||
|
"""
|
||||||
|
This is the base configuration every configuration (aka environnement) should inherit from. It
|
||||||
|
is recommended to configure third-party applications by creating a configuration mixins in
|
||||||
|
./configurations and compose the Base configuration with those mixins.
|
||||||
|
|
||||||
|
It depends on an environment variable that SHOULD be defined:
|
||||||
|
|
||||||
|
* DJANGO_SECRET_KEY
|
||||||
|
|
||||||
|
You may also want to override default configuration by setting the following environment
|
||||||
|
variables:
|
||||||
|
|
||||||
|
* DJANGO_SENTRY_DSN
|
||||||
|
* DB_NAME
|
||||||
|
* DB_HOST
|
||||||
|
* DB_PASSWORD
|
||||||
|
* DB_USER
|
||||||
|
"""
|
||||||
|
|
||||||
|
DEBUG = False
|
||||||
|
USE_SWAGGER = False
|
||||||
|
|
||||||
|
API_VERSION = "v1.0"
|
||||||
|
|
||||||
|
# Security
|
||||||
|
ALLOWED_HOSTS = values.ListValue([])
|
||||||
|
SECRET_KEY = values.Value(None)
|
||||||
|
|
||||||
|
# Application definition
|
||||||
|
ROOT_URLCONF = "publish.urls"
|
||||||
|
WSGI_APPLICATION = "publish.wsgi.application"
|
||||||
|
|
||||||
|
# Database
|
||||||
|
DATABASES = {
|
||||||
|
"default": {
|
||||||
|
"ENGINE": values.Value(
|
||||||
|
"django.db.backends.postgresql_psycopg2",
|
||||||
|
environ_name="DB_ENGINE",
|
||||||
|
environ_prefix=None,
|
||||||
|
),
|
||||||
|
"NAME": values.Value("publish", environ_name="DB_NAME", environ_prefix=None),
|
||||||
|
"USER": values.Value("dinum", environ_name="DB_USER", environ_prefix=None),
|
||||||
|
"PASSWORD": values.Value(
|
||||||
|
"pass", environ_name="DB_PASSWORD", environ_prefix=None
|
||||||
|
),
|
||||||
|
"HOST": values.Value(
|
||||||
|
"localhost", environ_name="DB_HOST", environ_prefix=None
|
||||||
|
),
|
||||||
|
"PORT": values.Value(5432, environ_name="DB_PORT", environ_prefix=None),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
DEFAULT_AUTO_FIELD = "django.db.models.AutoField"
|
||||||
|
|
||||||
|
# Static files (CSS, JavaScript, Images)
|
||||||
|
STATIC_URL = "/static/"
|
||||||
|
STATIC_ROOT = os.path.join(DATA_DIR, "static")
|
||||||
|
MEDIA_URL = "/media/"
|
||||||
|
MEDIA_ROOT = os.path.join(DATA_DIR, "media")
|
||||||
|
|
||||||
|
SITE_ID = 1
|
||||||
|
|
||||||
|
STORAGES = {
|
||||||
|
"default": {
|
||||||
|
"BACKEND": "django.core.files.storage.FileSystemStorage",
|
||||||
|
},
|
||||||
|
"staticfiles": {
|
||||||
|
"BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
# Internationalization
|
||||||
|
# https://docs.djangoproject.com/en/3.1/topics/i18n/
|
||||||
|
|
||||||
|
# Languages
|
||||||
|
LANGUAGE_CODE = values.Value("en-us")
|
||||||
|
|
||||||
|
DRF_NESTED_MULTIPART_PARSER = {
|
||||||
|
# output of parser is converted to querydict
|
||||||
|
# if is set to False, dict python is returned
|
||||||
|
"querydict": False,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Careful! Languages should be ordered by priority, as this tuple is used to get
|
||||||
|
# fallback/default languages throughout the app.
|
||||||
|
LANGUAGES = values.SingleNestedTupleValue(
|
||||||
|
(
|
||||||
|
("en-us", _("English")),
|
||||||
|
("fr-fr", _("French")),
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
LOCALE_PATHS = (os.path.join(BASE_DIR, "locale"),)
|
||||||
|
|
||||||
|
TIME_ZONE = "UTC"
|
||||||
|
USE_I18N = True
|
||||||
|
USE_TZ = True
|
||||||
|
|
||||||
|
# Templates
|
||||||
|
TEMPLATES = [
|
||||||
|
{
|
||||||
|
"BACKEND": "django.template.backends.django.DjangoTemplates",
|
||||||
|
"DIRS": [os.path.join(BASE_DIR, "templates")],
|
||||||
|
"OPTIONS": {
|
||||||
|
"context_processors": [
|
||||||
|
"django.contrib.auth.context_processors.auth",
|
||||||
|
"django.contrib.messages.context_processors.messages",
|
||||||
|
"django.template.context_processors.csrf",
|
||||||
|
"django.template.context_processors.debug",
|
||||||
|
"django.template.context_processors.i18n",
|
||||||
|
"django.template.context_processors.media",
|
||||||
|
"django.template.context_processors.request",
|
||||||
|
"django.template.context_processors.tz",
|
||||||
|
],
|
||||||
|
"loaders": [
|
||||||
|
"django.template.loaders.filesystem.Loader",
|
||||||
|
"django.template.loaders.app_directories.Loader",
|
||||||
|
],
|
||||||
|
},
|
||||||
|
},
|
||||||
|
]
|
||||||
|
|
||||||
|
MIDDLEWARE = [
|
||||||
|
"django.middleware.security.SecurityMiddleware",
|
||||||
|
"whitenoise.middleware.WhiteNoiseMiddleware",
|
||||||
|
"django.contrib.sessions.middleware.SessionMiddleware",
|
||||||
|
"django.middleware.locale.LocaleMiddleware",
|
||||||
|
"django.middleware.clickjacking.XFrameOptionsMiddleware",
|
||||||
|
"corsheaders.middleware.CorsMiddleware",
|
||||||
|
"django.middleware.common.CommonMiddleware",
|
||||||
|
"django.middleware.csrf.CsrfViewMiddleware",
|
||||||
|
"django.contrib.auth.middleware.AuthenticationMiddleware",
|
||||||
|
"django.contrib.messages.middleware.MessageMiddleware",
|
||||||
|
]
|
||||||
|
|
||||||
|
AUTHENTICATION_BACKENDS = [
|
||||||
|
"django.contrib.auth.backends.ModelBackend",
|
||||||
|
]
|
||||||
|
|
||||||
|
# Django applications from the highest priority to the lowest
|
||||||
|
INSTALLED_APPS = [
|
||||||
|
# publish
|
||||||
|
"core",
|
||||||
|
"demo",
|
||||||
|
"drf_spectacular",
|
||||||
|
# Third party apps
|
||||||
|
"corsheaders",
|
||||||
|
"dockerflow.django",
|
||||||
|
"rest_framework",
|
||||||
|
"parler",
|
||||||
|
"easy_thumbnails",
|
||||||
|
# Django
|
||||||
|
"django.contrib.admin",
|
||||||
|
"django.contrib.auth",
|
||||||
|
"django.contrib.contenttypes",
|
||||||
|
"django.contrib.postgres",
|
||||||
|
"django.contrib.sessions",
|
||||||
|
"django.contrib.sites",
|
||||||
|
"django.contrib.messages",
|
||||||
|
"django.contrib.staticfiles",
|
||||||
|
]
|
||||||
|
|
||||||
|
# Cache
|
||||||
|
CACHES = {
|
||||||
|
"default": {"BACKEND": "django.core.cache.backends.locmem.LocMemCache"},
|
||||||
|
}
|
||||||
|
|
||||||
|
REST_FRAMEWORK = {
|
||||||
|
"DEFAULT_AUTHENTICATION_CLASSES": (
|
||||||
|
"core.authentication.DelegatedJWTAuthentication",
|
||||||
|
),
|
||||||
|
"DEFAULT_PARSER_CLASSES": [
|
||||||
|
"rest_framework.parsers.JSONParser",
|
||||||
|
"nested_multipart_parser.drf.DrfNestedParser",
|
||||||
|
],
|
||||||
|
"EXCEPTION_HANDLER": "core.api.exception_handler",
|
||||||
|
"DEFAULT_PAGINATION_CLASS": "rest_framework.pagination.PageNumberPagination",
|
||||||
|
"PAGE_SIZE": 20,
|
||||||
|
"DEFAULT_VERSIONING_CLASS": "rest_framework.versioning.URLPathVersioning",
|
||||||
|
"DEFAULT_SCHEMA_CLASS": "drf_spectacular.openapi.AutoSchema",
|
||||||
|
}
|
||||||
|
|
||||||
|
SPECTACULAR_SETTINGS = {
|
||||||
|
"TITLE": "publish API",
|
||||||
|
"DESCRIPTION": "This is the publish API schema.",
|
||||||
|
"VERSION": "1.0.0",
|
||||||
|
"SERVE_INCLUDE_SCHEMA": False,
|
||||||
|
"ENABLE_DJANGO_DEPLOY_CHECK": values.BooleanValue(
|
||||||
|
default=False,
|
||||||
|
environ_name="SPECTACULAR_SETTINGS_ENABLE_DJANGO_DEPLOY_CHECK",
|
||||||
|
),
|
||||||
|
"COMPONENT_SPLIT_REQUEST": True,
|
||||||
|
# OTHER SETTINGS
|
||||||
|
"SWAGGER_UI_DIST": "SIDECAR", # shorthand to use the sidecar instead
|
||||||
|
"SWAGGER_UI_FAVICON_HREF": "SIDECAR",
|
||||||
|
"REDOC_DIST": "SIDECAR",
|
||||||
|
}
|
||||||
|
|
||||||
|
SIMPLE_JWT = {
|
||||||
|
"ALGORITHM": values.Value("HS256", environ_name="JWT_ALGORITHM"),
|
||||||
|
"SIGNING_KEY": values.SecretValue(
|
||||||
|
environ_name="JWT_PRIVATE_SIGNING_KEY",
|
||||||
|
),
|
||||||
|
"AUTH_HEADER_TYPES": ("Bearer",),
|
||||||
|
"AUTH_HEADER_NAME": "HTTP_AUTHORIZATION",
|
||||||
|
"USER_ID_FIELD": "sub",
|
||||||
|
"USER_ID_CLAIM": "sub",
|
||||||
|
"AUTH_TOKEN_CLASSES": ("rest_framework_simplejwt.tokens.AccessToken",),
|
||||||
|
}
|
||||||
|
    JWT_USER_GETTER = values.Value(
        "core.models.oidc_user_getter",
        environ_name="publish_JWT_USER_GETTER",
        environ_prefix=None,
    )

    # Mail
    EMAIL_BACKEND = values.Value("django.core.mail.backends.smtp.EmailBackend")
    EMAIL_HOST = values.Value(None)
    EMAIL_HOST_USER = values.Value(None)
    EMAIL_HOST_PASSWORD = values.Value(None)
    EMAIL_PORT = values.PositiveIntegerValue(None)
    EMAIL_USE_TLS = values.BooleanValue(False)
    EMAIL_FROM = values.Value("from@example.com")

    AUTH_USER_MODEL = "core.User"

    # CORS
    CORS_ALLOW_CREDENTIALS = True
    CORS_ALLOW_ALL_ORIGINS = values.BooleanValue(False)
    CORS_ALLOWED_ORIGINS = values.ListValue([])
    CORS_ALLOWED_ORIGIN_REGEXES = values.ListValue([])

    # Sentry
    SENTRY_DSN = values.Value(None, environ_name="SENTRY_DSN")

    # Easy thumbnails
    THUMBNAIL_EXTENSION = "webp"
    THUMBNAIL_TRANSPARENCY_EXTENSION = "webp"
    THUMBNAIL_ALIASES = {}

    # Celery
    CELERY_BROKER_URL = values.Value("redis://redis:6379/0")
    CELERY_BROKER_TRANSPORT_OPTIONS = values.DictValue({})

    # pylint: disable=invalid-name
    @property
    def ENVIRONMENT(self):
        """Environment in which the application is launched."""
        return self.__class__.__name__.lower()

    # pylint: disable=invalid-name
    @property
    def RELEASE(self):
        """
        Return the release information.

        Delegate to the module function to enable easier testing.
        """
        return get_release()

    # pylint: disable=invalid-name
    @property
    def PARLER_LANGUAGES(self):
        """
        Return the languages for Parler, computed from the LANGUAGES and
        LANGUAGE_CODE settings.
        """
        return {
            self.SITE_ID: tuple({"code": code} for code, _name in self.LANGUAGES),
            "default": {
                "fallbacks": [self.LANGUAGE_CODE],
                "hide_untranslated": False,
            },
        }
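The PARLER_LANGUAGES property derives the django-parler configuration from the plain Django language settings rather than duplicating them. A standalone sketch of that computation, with illustrative SITE_ID, LANGUAGE_CODE and LANGUAGES values (not the project's real ones):

```python
# Stand-in values; in the settings class these come from Django configuration.
SITE_ID = 1
LANGUAGE_CODE = "en-us"
LANGUAGES = [("en-us", "English"), ("fr-fr", "French")]

# Same shape as the PARLER_LANGUAGES property: one entry per site listing the
# language codes, plus a "default" block with fallbacks.
parler_languages = {
    SITE_ID: tuple({"code": code} for code, _name in LANGUAGES),
    "default": {
        "fallbacks": [LANGUAGE_CODE],
        "hide_untranslated": False,
    },
}
print(parler_languages[SITE_ID])  # → ({'code': 'en-us'}, {'code': 'fr-fr'})
```

Adding a language to LANGUAGES then automatically makes Parler aware of it, with the default language as fallback.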

    @classmethod
    def post_setup(cls):
        """Post-setup configuration.

        This is the place to configure settings that require other settings
        to be loaded.
        """
        super().post_setup()

        # The SENTRY_DSN setting should be defined to activate Sentry for an environment
        if cls.SENTRY_DSN is not None:
            sentry_sdk.init(
                dsn=cls.SENTRY_DSN,
                environment=cls.__name__.lower(),
                release=get_release(),
                integrations=[DjangoIntegration()],
            )
            with sentry_sdk.configure_scope() as scope:
                scope.set_extra("application", "backend")


class Build(Base):
    """Settings used when the application is built.

    This environment should not be used to run the application, only to build
    it with non-blocking settings.
    """

    SECRET_KEY = values.Value("DummyKey")
    STORAGES = {
        "default": {
            "BACKEND": "django.core.files.storage.FileSystemStorage",
        },
        "staticfiles": {
            "BACKEND": values.Value(
                "whitenoise.storage.CompressedManifestStaticFilesStorage",
                environ_name="STORAGES_STATICFILES_BACKEND",
            ),
        },
    }


class Development(Base):
    """
    Development environment settings.

    We set DEBUG to True and configure the server to respond to all hosts.
    """

    ALLOWED_HOSTS = ["*"]
    CORS_ALLOW_ALL_ORIGINS = True
    CSRF_TRUSTED_ORIGINS = ["http://localhost:8072"]
    DEBUG = True

    SESSION_COOKIE_NAME = "publish_sessionid"

    USE_SWAGGER = True

    def __init__(self):
        # pylint: disable=invalid-name
        self.INSTALLED_APPS += ["django_extensions", "drf_spectacular_sidecar"]

class Test(Base):
    """Test environment settings."""

    LOGGING = values.DictValue(
        {
            "version": 1,
            "disable_existing_loggers": False,
            "handlers": {
                "console": {
                    "class": "logging.StreamHandler",
                },
            },
            "loggers": {
                "publish": {
                    "handlers": ["console"],
                    "level": "DEBUG",
                },
            },
        }
    )
    PASSWORD_HASHERS = [
        "django.contrib.auth.hashers.MD5PasswordHasher",
    ]
    USE_SWAGGER = True

    STORAGES = {
        "default": {
            "BACKEND": "django.core.files.storage.FileSystemStorage",
        },
        "staticfiles": {
            "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
        },
    }

    CELERY_TASK_ALWAYS_EAGER = values.BooleanValue(True)

    def __init__(self):
        # pylint: disable=invalid-name
        self.INSTALLED_APPS += ["drf_spectacular_sidecar"]


class ContinuousIntegration(Test):
    """
    Continuous integration environment settings.

    Nota bene: it should inherit from the Test environment.
    """

class Production(Base):
    """
    Production environment settings.

    You must define the ALLOWED_HOSTS environment variable in the Production
    configuration (and derived configurations):
        ALLOWED_HOSTS=["foo.com", "foo.fr"]
    """

    # Security
    ALLOWED_HOSTS = values.ListValue(None)
    CSRF_TRUSTED_ORIGINS = values.ListValue([])
    SECURE_BROWSER_XSS_FILTER = True
    SECURE_CONTENT_TYPE_NOSNIFF = True

    # SECURE_PROXY_SSL_HEADER allows fixing the scheme in Django's HttpRequest
    # object when the application runs behind a reverse proxy.
    #
    # Keep this SECURE_PROXY_SSL_HEADER configuration only if:
    # - your Django app is behind a proxy,
    # - your proxy strips the X-Forwarded-Proto header from all incoming requests,
    # - your proxy sets the X-Forwarded-Proto header and sends it to Django.
    #
    # In other cases, comment out the following line to avoid security issues.
    # SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")

    # Modern browsers require the `secure` attribute on cookies with `SameSite=None`
    CSRF_COOKIE_SECURE = True
    SESSION_COOKIE_SECURE = True

    # For static files in production, we want a storage backend that includes a
    # hash, calculated from the file content, in the filename, so that browsers
    # always get the updated version of each file.
    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3.S3Storage",
        },
        "staticfiles": {
            "BACKEND": values.Value(
                "whitenoise.storage.CompressedManifestStaticFilesStorage",
                environ_name="STORAGES_STATICFILES_BACKEND",
            ),
        },
    }
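The manifest storage comment above can be made concrete: Django's hashed-static-files scheme inserts a digest of the file content before the extension, so a changed file gets a new URL and stale browser caches are never served. A minimal sketch, assuming the md5-prefix naming convention that Django's ManifestStaticFilesStorage uses (the real backend also rewrites URL references inside CSS, which this skips):

```python
import hashlib
from pathlib import PurePosixPath


def hashed_name(name: str, content: bytes) -> str:
    """Insert a short content digest before the file extension,
    e.g. css/app.css -> css/app.<digest>.css."""
    digest = hashlib.md5(content).hexdigest()[:12]  # assumed md5[:12] scheme
    path = PurePosixPath(name)
    return str(path.with_name(f"{path.stem}.{digest}{path.suffix}"))


# Different content yields a different name, so caches can be immutable.
print(hashed_name("css/app.css", b"body { color: red }"))
print(hashed_name("css/app.css", b"body { color: blue }"))
```

This is why the storage is safe to combine with far-future `Cache-Control` headers: the filename itself is the cache key.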

    # Privacy
    SECURE_REFERRER_POLICY = "same-origin"

    # Media
    AWS_S3_ENDPOINT_URL = values.Value()
    AWS_S3_ACCESS_KEY_ID = values.Value()
    AWS_S3_SECRET_ACCESS_KEY = values.Value()
    AWS_STORAGE_BUCKET_NAME = values.Value("tf-default-publish-media-storage")
    AWS_S3_REGION_NAME = values.Value()


class Feature(Production):
    """
    Feature environment settings.

    Nota bene: it should inherit from the Production environment.
    """


class Staging(Production):
    """
    Staging environment settings.

    Nota bene: it should inherit from the Production environment.
    """


class PreProduction(Production):
    """
    Pre-production environment settings.

    Nota bene: it should inherit from the Production environment.
    """
20  src/backend/publish/urls.py  Normal file
@@ -0,0 +1,20 @@
"""URL configuration for the publish project"""

from django.conf import settings
from django.conf.urls.static import static
from django.contrib import admin
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.urls import include, path, re_path

urlpatterns = [
    path("admin/", admin.site.urls),
    path("", include('core.urls')),
]

if settings.DEBUG:
    urlpatterns = (
        urlpatterns
        + staticfiles_urlpatterns()
        + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
    )
17  src/backend/publish/wsgi.py  Normal file
@@ -0,0 +1,17 @@
"""
WSGI config for the publish project.

It exposes the WSGI callable as a module-level variable named ``application``.

For more information on this file, see
https://docs.djangoproject.com/en/3.1/howto/deployment/wsgi/
"""

import os

from configurations.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "publish.settings")
os.environ.setdefault("DJANGO_CONFIGURATION", "Development")

application = get_wsgi_application()
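The DJANGO_CONFIGURATION variable set in wsgi.py is how django-configurations decides which of the settings classes above (Development, Production, ...) to load. As a rough, simplified sketch of that selection — the real loader imports the class from the settings module; the classes here are stand-ins, not the project's:

```python
import os


# Stand-in settings classes mirroring the inheritance used in settings.py.
class Base:
    DEBUG = False


class Development(Base):
    DEBUG = True


class Production(Base):
    pass


# Roughly what django-configurations does: resolve the class named by the
# DJANGO_CONFIGURATION environment variable, defaulting here to Development.
CONFIGURATIONS = {"Development": Development, "Production": Production}
name = os.environ.get("DJANGO_CONFIGURATION", "Development")
selected = CONFIGURATIONS.get(name, Development)
print(selected.__name__)
```

Because selection happens by class name, adding a new environment is just subclassing Production (as Feature, Staging and PreProduction do) and pointing the variable at it.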
138  src/backend/pyproject.toml  Normal file
@@ -0,0 +1,138 @@
#
# publish package
#
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "publish"
version = "0.1.0"
authors = [{ "name" = "DINUM", "email" = "dev@mail.numerique.gouv.fr" }]
classifiers = [
    "Development Status :: 5 - Production/Stable",
    "Framework :: Django",
    "Framework :: Django :: 5",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Natural Language :: English",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.10",
]
description = "An application to print markdown to pdf from a set of managed templates."
keywords = ["Django", "Contacts", "Teams", "RBAC"]
license = { file = "LICENSE" }
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
    "boto3==1.33.6",
    "Brotli==1.1.0",
    "celery[redis]==5.3.6",
    "django-configurations==2.5",
    "django-cors-headers==4.3.1",
    "django-countries==7.5.1",
    "django-parler==2.3",
    "django-storages==1.14.2",
    "django-timezone-field>=5.1",
    "django==5.0",
    "djangorestframework-simplejwt==5.3.0",
    "djangorestframework==3.14.0",
    "drf_spectacular==0.26.5",
    "dockerflow==2022.8.0",
    "easy_thumbnails==2.8.5",
    "factory_boy==3.3.0",
    "gunicorn==21.2.0",
    "jsonschema==4.20.0",
    "markdown==3.5.1",
    "nested-multipart-parser==1.5.0",
    "psycopg[binary]==3.1.14",
    "PyJWT==2.8.0",
    "requests==2.31.0",
    "sentry-sdk==1.38.0",
    "url-normalize==1.4.3",
    "WeasyPrint>=60.2",
    "whitenoise==6.6.0",
]

[project.urls]
"Bug Tracker" = "https://github.com/numerique-gouv/publish/issues/new"
"Changelog" = "https://github.com/numerique-gouv/publish/blob/main/CHANGELOG.md"
"Homepage" = "https://github.com/numerique-gouv/publish"
"Repository" = "https://github.com/numerique-gouv/publish"

[project.optional-dependencies]
dev = [
    "django-extensions==3.2.3",
    "drf-spectacular-sidecar==2023.12.1",
    "ipdb==0.13.13",
    "ipython==8.18.1",
    "pyfakefs==5.3.2",
    "pylint-django==2.5.5",
    "pylint==3.0.3",
    "pytest-cov==4.1.0",
    "pytest-django==4.7.0",
    "pytest==7.4.3",
    "pytest-icdiff==0.8",
    "pytest-xdist==3.5.0",
    "responses==0.24.1",
    "ruff==0.1.6",
    "types-requests==2.31.0.10",
]

[tool.setuptools]
packages = { find = { where = ["."], exclude = ["tests"] } }
zip-safe = true

[tool.distutils.bdist_wheel]
universal = true

[tool.ruff]
exclude = [
    ".git",
    ".venv",
    "build",
    "venv",
    "__pycache__",
    "*/migrations/*",
]
ignore = ["DJ001", "PLR2004"]
line-length = 88

[tool.ruff.lint]
select = [
    "B",  # flake8-bugbear
    "BLE",  # flake8-blind-except
    "C4",  # flake8-comprehensions
    "DJ",  # flake8-django
    "I",  # isort
    "PLC",  # pylint-convention
    "PLE",  # pylint-error
    "PLR",  # pylint-refactoring
    "PLW",  # pylint-warning
    "RUF100",  # Ruff unused-noqa
    "RUF200",  # Ruff check pyproject.toml
    "S",  # flake8-bandit
    "SLF",  # flake8-self
    "T20",  # flake8-print
]

[tool.ruff.lint.isort]
section-order = ["future", "standard-library", "django", "third-party", "publish", "first-party", "local-folder"]
sections = { publish = ["core"], django = ["django"] }

[tool.ruff.per-file-ignores]
"**/tests/*" = ["S", "SLF"]

[tool.pytest.ini_options]
addopts = [
    "-v",
    "--cov-report",
    "term-missing",
    # Allow test files to have the same name in different directories.
    "--import-mode=importlib",
]
python_files = [
    "test_*.py",
    "tests.py",
]
7  src/backend/setup.py  Normal file
@@ -0,0 +1,7 @@
#!/usr/bin/env python
"""Setup file for the publish module. All configuration stands in the pyproject.toml file."""

from setuptools import setup

setup()
22  src/mail/bin/html-to-plain-text  Executable file
@@ -0,0 +1,22 @@
#!/usr/bin/env bash
set -eo pipefail

# Run html-to-text to convert all HTML files to text files
DIR_MAILS="../backend/core/templates/mail/"

if [ ! -d "${DIR_MAILS}" ]; then
    mkdir -p "${DIR_MAILS}";
fi

if [ ! -d "${DIR_MAILS}"html/ ]; then
    mkdir -p "${DIR_MAILS}"html/;
    exit;
fi

for file in "${DIR_MAILS}"html/*.html;
    do html-to-text -j ./html-to-text.config.json < "$file" > "${file%.html}".txt; done;

if [ ! -d "${DIR_MAILS}"text/ ]; then
    mkdir -p "${DIR_MAILS}"text/;
fi

mv "${DIR_MAILS}"html/*.txt "${DIR_MAILS}"text/;
9  src/mail/bin/mjml-to-html  Executable file
@@ -0,0 +1,9 @@
#!/usr/bin/env bash

# Run the mjml command to convert all MJML templates to HTML files
DIR_MAILS="../backend/core/templates/mail/html/"

if [ ! -d "${DIR_MAILS}" ]; then
    mkdir -p "${DIR_MAILS}";
fi
mjml mjml/*.mjml -o "${DIR_MAILS}";
11  src/mail/html-to-text.config.json  Normal file
@@ -0,0 +1,11 @@
{
    "wordwrap": 600,
    "selectors": [
        {
            "selector": "h1",
            "options": {
                "uppercase": false
            }
        }
    ]
}
28  src/mail/mjml/hello.mjml  Normal file
@@ -0,0 +1,28 @@
<mjml>
  <mj-include path="./partial/header.mjml" />
  <mj-body mj-class="bg--blue-100">
    <mj-wrapper css-class="wrapper" padding="20px 40px 40px 40px">
      <mj-section>
        <mj-column>
          <mj-image src="{% base64_static 'publish/images/logo_publish.png' %}" width="200px" align="left" alt="{% trans 'Company logo' %}" />
        </mj-column>
      </mj-section>
      <mj-section mj-class="bg--blue-100" border-radius="6px 6px 0 0" padding="30px 50px 60px 50px">
        <mj-column>
          <mj-text padding="0">
            <p>
              {% if fullname %}
                {% blocktranslate with name=fullname %}Hello {{ name }}{% endblocktranslate %}
              {% else %}
                {% trans "Hello" %}
              {% endif %}<br/>
              <strong>{% trans "Thank you very much for your visit!" %}</strong>
            </p>
          </mj-text>
        </mj-column>
      </mj-section>
    </mj-wrapper>
    <mj-include path="./partial/footer.mjml" />
  </mj-body>
</mjml>
9  src/mail/mjml/partial/footer.mjml  Normal file
@@ -0,0 +1,9 @@
<mj-section padding="0">
  <mj-column>
    <mj-text mj-class="text--small" align="center" padding="20px 20px">
      {% blocktranslate with href=site.url name=site.name trimmed %}
        This mail has been sent to {{ email }} by <a href="{{ href }}">{{ name }}</a>
      {% endblocktranslate %}
    </mj-text>
  </mj-column>
</mj-section>
48  src/mail/mjml/partial/header.mjml  Normal file
@@ -0,0 +1,48 @@
<mj-head>
  <mj-title>{{ title }}</mj-title>
  <mj-preview>
    <!--
      We load the Django tags here; this way they are put within the body in
      the HTML output, so the html-to-text command includes them in its output.
    -->
    {% load i18n static extra_tags %}
    {{ title }}
  </mj-preview>
  <mj-attributes>
    <mj-font name="Roboto" href="https://fonts.googleapis.com/css2?family=Roboto:wght@400;700;900&display=swap" />
    <mj-all
      font-family="Roboto, -apple-system, BlinkMacSystemFont, 'Segoe UI', Oxygen, Ubuntu, Cantarell, 'Helvetica Neue', sans-serif"
      font-size="16px"
      line-height="1.5em"
      color="#031963"
    />
    <mj-class name="text--small" font-size="0.875rem" />
    <mj-class name="bg--blue-100" background-color="#EDF5FA" />
  </mj-attributes>
  <mj-style>
    /* Reset */
    h1, h2, h3, h4, h5, h6, p {
      margin: 0;
      padding: 0;
    }

    a {
      color: inherit;
    }
  </mj-style>
  <mj-style>
    /* Global styles */
    h1 {
      color: #055FD2;
      font-size: 2rem;
      line-height: 1em;
      font-weight: 700;
    }

    .wrapper {
      background: #FFFFFF;
      border-radius: 0 0 6px 6px;
      box-shadow: 0 0 6px rgba(2 117 180 / 0.3);
    }
  </mj-style>
</mj-head>
22  src/mail/package.json  Normal file
@@ -0,0 +1,22 @@
{
  "name": "mail_mjml",
  "version": "1.1.0",
  "description": "A utility to generate Django HTML and text templates from MJML templates",
  "type": "module",
  "dependencies": {
    "@html-to/text-cli": "0.5.4",
    "mjml": "4.14.1"
  },
  "private": true,
  "scripts": {
    "build-mjml-to-html": "./bin/mjml-to-html",
    "build-html-to-plain-text": "./bin/html-to-plain-text",
    "build": "yarn build-mjml-to-html; yarn build-html-to-plain-text;"
  },
  "volta": {
    "node": "16.15.1"
  },
  "repository": "https://github.com/numerique-gouv/publish",
  "author": "DINUM",
  "license": "MIT"
}
1126  src/mail/yarn.lock  Normal file
File diff suppressed because it is too large.
25  src/tsclient/package.json  Normal file
@@ -0,0 +1,25 @@
{
  "name": "publish-openapi-client-ts",
  "version": "1.1.0",
  "private": true,
  "description": "Tool to generate a TypeScript API client for the publish application.",
  "scripts": {
    "generate:api:client:local": "./scripts/openapi-typescript-codegen/generate_api_client_local.sh $1"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/numerique-gouv/publish.git"
  },
  "author": {
    "name": "DINUM",
    "email": "dev@mail.numerique.gouv.fr"
  },
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/numerique-gouv/publish/issues"
  },
  "homepage": "https://github.com/numerique-gouv/publish#readme",
  "devDependencies": {
    "openapi-typescript-codegen": "0.25.0"
  }
}
@@ -0,0 +1,8 @@
#!/usr/bin/env bash

# usage: yarn generate:api:client:local [--output]

# OPTIONS:
#   --output  the path of the folder where types will be generated

openapi --input http://app-dev:8000/v1.0/swagger.json --output "$1" --indent='2' --name ApiClientpublish --useOptions
133  src/tsclient/yarn.lock  Normal file
@@ -0,0 +1,133 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1


"@apidevtools/json-schema-ref-parser@9.0.9":
  version "9.0.9"
  resolved "https://registry.yarnpkg.com/@apidevtools/json-schema-ref-parser/-/json-schema-ref-parser-9.0.9.tgz#d720f9256e3609621280584f2b47ae165359268b"
  integrity sha512-GBD2Le9w2+lVFoc4vswGI/TjkNIZSVp7+9xPf+X3uidBfWnAeUWmquteSyt0+VCrhNMWj/FTABISQrD3Z/YA+w==
  dependencies:
    "@jsdevtools/ono" "^7.1.3"
    "@types/json-schema" "^7.0.6"
    call-me-maybe "^1.0.1"
    js-yaml "^4.1.0"

"@jsdevtools/ono@^7.1.3":
  version "7.1.3"
  resolved "https://registry.yarnpkg.com/@jsdevtools/ono/-/ono-7.1.3.tgz#9df03bbd7c696a5c58885c34aa06da41c8543796"
  integrity sha512-4JQNk+3mVzK3xh2rqd6RB4J46qUR19azEHBneZyTZM+c456qOrbbM/5xcR8huNCCcbVt7+UmizG6GuUvPvKUYg==

"@types/json-schema@^7.0.6":
  version "7.0.11"
  resolved "https://registry.yarnpkg.com/@types/json-schema/-/json-schema-7.0.11.tgz#d421b6c527a3037f7c84433fd2c4229e016863d3"
  integrity sha512-wOuvG1SN4Us4rez+tylwwwCV1psiNVOkJeM3AUWUNWg/jDQY2+HE/444y5gc+jBmRqASOm2Oeh5c1axHobwRKQ==

argparse@^2.0.1:
  version "2.0.1"
  resolved "https://registry.yarnpkg.com/argparse/-/argparse-2.0.1.tgz#246f50f3ca78a3240f6c997e8a9bd1eac49e4b38"
  integrity sha512-8+9WqebbFzpX9OR+Wa6O29asIogeRMzcGtAINdpMHHyAg10f05aSFVBbcEqGf/PXw1EjAZ+q2/bEBg3DvurK3Q==

call-me-maybe@^1.0.1:
  version "1.0.2"
  resolved "https://registry.yarnpkg.com/call-me-maybe/-/call-me-maybe-1.0.2.tgz#03f964f19522ba643b1b0693acb9152fe2074baa"
  integrity sha512-HpX65o1Hnr9HH25ojC1YGs7HCQLq0GCOibSaWER0eNpgJ/Z1MZv2mTc7+xh6WOPxbRVcmgbv4hGU+uSQ/2xFZQ==

camelcase@^6.3.0:
  version "6.3.0"
  resolved "https://registry.yarnpkg.com/camelcase/-/camelcase-6.3.0.tgz#5685b95eb209ac9c0c177467778c9c84df58ba9a"
  integrity sha512-Gmy6FhYlCY7uOElZUSbxo2UCDH8owEk996gkbrpsgGtrJLM3J7jGxl9Ic7Qwwj4ivOE5AWZWRMecDdF7hqGjFA==

commander@^11.0.0:
  version "11.0.0"
  resolved "https://registry.yarnpkg.com/commander/-/commander-11.0.0.tgz#43e19c25dbedc8256203538e8d7e9346877a6f67"
  integrity sha512-9HMlXtt/BNoYr8ooyjjNRdIilOTkVJXB+GhxMTtOKwk0R4j4lS4NpjuqmRxroBfnfTSHQIHQB7wryHhXarNjmQ==

fs-extra@^11.1.1:
  version "11.1.1"
  resolved "https://registry.yarnpkg.com/fs-extra/-/fs-extra-11.1.1.tgz#da69f7c39f3b002378b0954bb6ae7efdc0876e2d"
  integrity sha512-MGIE4HOvQCeUCzmlHs0vXpih4ysz4wg9qiSAu6cd42lVwPbTM1TjV7RusoyQqMmk/95gdQZX72u+YW+c3eEpFQ==
  dependencies:
    graceful-fs "^4.2.0"
    jsonfile "^6.0.1"
    universalify "^2.0.0"

graceful-fs@^4.1.6, graceful-fs@^4.2.0:
  version "4.2.10"
  resolved "https://registry.yarnpkg.com/graceful-fs/-/graceful-fs-4.2.10.tgz#147d3a006da4ca3ce14728c7aefc287c367d7a6c"
  integrity sha512-9ByhssR2fPVsNZj478qUUbKfmL0+t5BDVyjShtyZZLiK7ZDAArFFfopyOTj0M05wE2tJPisA4iTnnXl2YoPvOA==

handlebars@^4.7.7:
  version "4.7.7"
  resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.7.7.tgz#9ce33416aad02dbd6c8fafa8240d5d98004945a1"
  integrity sha512-aAcXm5OAfE/8IXkcZvCepKU3VzW1/39Fb5ZuqMtgI/hT8X2YgoMvBY5dLhq/cpOvw7Lk1nK/UF71aLG/ZnVYRA==
  dependencies:
    minimist "^1.2.5"
    neo-async "^2.6.0"
    source-map "^0.6.1"
    wordwrap "^1.0.0"
  optionalDependencies:
    uglify-js "^3.1.4"

js-yaml@^4.1.0:
  version "4.1.0"
  resolved "https://registry.yarnpkg.com/js-yaml/-/js-yaml-4.1.0.tgz#c1fb65f8f5017901cdd2c951864ba18458a10602"
  integrity sha512-wpxZs9NoxZaJESJGIZTyDEaYpl0FKSA+FB9aJiyemKhMwkxQg63h4T1KJgUGHpTqPDNRcmmYLugrRjJlBtWvRA==
  dependencies:
    argparse "^2.0.1"

json-schema-ref-parser@^9.0.9:
  version "9.0.9"
  resolved "https://registry.yarnpkg.com/json-schema-ref-parser/-/json-schema-ref-parser-9.0.9.tgz#66ea538e7450b12af342fa3d5b8458bc1e1e013f"
  integrity sha512-qcP2lmGy+JUoQJ4DOQeLaZDqH9qSkeGCK3suKWxJXS82dg728Mn3j97azDMaOUmJAN4uCq91LdPx4K7E8F1a7Q==
  dependencies:
    "@apidevtools/json-schema-ref-parser" "9.0.9"

jsonfile@^6.0.1:
  version "6.1.0"
  resolved "https://registry.yarnpkg.com/jsonfile/-/jsonfile-6.1.0.tgz#bc55b2634793c679ec6403094eb13698a6ec0aae"
  integrity sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ==
  dependencies:
    universalify "^2.0.0"
  optionalDependencies:
    graceful-fs "^4.1.6"

minimist@^1.2.5:
  version "1.2.8"
  resolved "https://registry.yarnpkg.com/minimist/-/minimist-1.2.8.tgz#c1a464e7693302e082a075cee0c057741ac4772c"
  integrity sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==

neo-async@^2.6.0:
  version "2.6.2"
  resolved "https://registry.yarnpkg.com/neo-async/-/neo-async-2.6.2.tgz#b4aafb93e3aeb2d8174ca53cf163ab7d7308305f"
  integrity sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==

openapi-typescript-codegen@0.25.0:
  version "0.25.0"
  resolved "https://registry.yarnpkg.com/openapi-typescript-codegen/-/openapi-typescript-codegen-0.25.0.tgz#0cb028f54b33b0a63bd9da3756c1c41b4e1a70e2"
  integrity sha512-nN/TnIcGbP58qYgwEEy5FrAAjePcYgfMaCe3tsmYyTgI3v4RR9v8os14L+LEWDvV50+CmqiyTzRkKKtJeb6Ybg==
  dependencies:
    camelcase "^6.3.0"
    commander "^11.0.0"
    fs-extra "^11.1.1"
    handlebars "^4.7.7"
    json-schema-ref-parser "^9.0.9"

source-map@^0.6.1:
  version "0.6.1"
  resolved "https://registry.yarnpkg.com/source-map/-/source-map-0.6.1.tgz#74722af32e9614e9c287a8d0bbde48b5e2f1a263"
  integrity sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==

uglify-js@^3.1.4:
  version "3.17.4"
  resolved "https://registry.yarnpkg.com/uglify-js/-/uglify-js-3.17.4.tgz#61678cf5fa3f5b7eb789bb345df29afb8257c22c"
  integrity sha512-T9q82TJI9e/C1TAxYvfb16xO120tMVFZrGA3f9/P4424DNu6ypK103y0GPFVa17yotwSyZW5iYXgjYHkGrJW/g==

universalify@^2.0.0:
  version "2.0.0"
  resolved "https://registry.yarnpkg.com/universalify/-/universalify-2.0.0.tgz#75a4984efedc4b08975c5aeb73f530d02df25717"
  integrity sha512-hAZsKq7Yy11Zu1DE0OzWjw7nnLZmJZYTDZZyEFHZdUhV8FkH5MCfoU1XMaxXovpyW5nq5scPqq0ZDP9Zyl04oQ==

wordwrap@^1.0.0:
  version "1.0.0"
  resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
  integrity sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==