✨(project) Django boilerplate
This commit introduces a boilerplate inspired by https://github.com/numerique-gouv/impress. The code has been cleaned to remove unnecessary Impress logic and dependencies.

Changes made:
- Removed Minio, WebRTC, and bucket creation from the stack.
- Removed the Next.js frontend (it will be replaced by Vite).
- Cleaned up Impress-specific backend logic.

The whole stack remains functional:
- All tests pass.
- Linter checks pass.
- Agent Connexion sources are already set up.

Why clear out the code? To adhere to the KISS principle, we aim to maintain a minimalist codebase. Cloning Impress allowed us to quickly inherit its code-quality tools and deployment configurations for staging, pre-production, and production environments.

What's broken?
- The tsclient is no longer functional.
- Some make commands need to be fixed.
- Helm sources are outdated.
- Naming across the project sources is inconsistent (impress, visio, etc.).
- CI is not configured properly.

This list might be incomplete. Let's grind it.
committed by lebaudantoine
parent 2d81979b0a
commit 5b1a2b20de
36 .dockerignore Normal file
@@ -0,0 +1,36 @@
# Python
__pycache__
*.pyc
**/__pycache__
**/*.pyc
venv
.venv

# System-specific files
.DS_Store
**/.DS_Store

# Docker
docker-compose.*
env.d

# Docs
docs
*.md
*.log

# Development/test cache & configurations
data
.cache
.circleci
.git
.vscode
.iml
.idea
db.sqlite3
.mypy_cache
.pylint.d
.pytest_cache

# Frontend
node_modules
6 .github/ISSUE_TEMPLATE.md vendored Normal file
@@ -0,0 +1,6 @@
<!---
Thanks for filing an issue 😄 ! Before you submit, please read the following:

Check the other issue templates if you are trying to submit a bug report, feature request, or question
Search open/closed issues before submitting since someone might have asked the same thing before!
-->
28 .github/ISSUE_TEMPLATE/Bug_report.md vendored Normal file
@@ -0,0 +1,28 @@
---
name: 🐛 Bug Report
about: If something is not working as expected 🤔.

---

## Bug Report

**Problematic behavior**
A clear and concise description of the behavior.

**Expected behavior/code**
A clear and concise description of what you expected to happen (or code).

**Steps to Reproduce**
1. Do this...
2. Then this...
3. And then the bug happens!

**Environment**
- Impress version:
- Platform:

**Possible Solution**
<!--- Only if you have suggestions on a fix for the bug -->

**Additional context/Screenshots**
Add any other context about the problem here. If applicable, add screenshots to help explain.
23 .github/ISSUE_TEMPLATE/Feature_request.md vendored Normal file
@@ -0,0 +1,23 @@
---
name: ✨ Feature Request
about: I have a suggestion (and may want to build it 💪)!

---

## Feature Request

**Is your feature request related to a problem or unsupported use case? Please describe.**
A clear and concise description of what the problem is. For example: I need to do some task and I have an issue...

**Describe the solution you'd like**
A clear and concise description of what you want to happen. Add any considered drawbacks.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Discovery, Documentation, Adoption, Migration Strategy**
If you can, explain how users will be able to use this and possibly write out a version of the docs (if applicable).
Maybe a screenshot or design?

**Do you want to work on it through a Pull Request?**
<!-- Make sure to coordinate with us before you spend too much time working on an implementation! -->
22 .github/ISSUE_TEMPLATE/Support_question.md vendored Normal file
@@ -0,0 +1,22 @@
---
name: 🤗 Support Question
about: If you have a question 💬, or something was not clear from the docs!

---

<!-- ^ Click "Preview" for a nicer view! ^
We primarily use GitHub as an issue tracker. If however you're encountering an issue not covered in the docs, we may be able to help! -->

---

Please make sure you have read our [main Readme](https://github.com/numerique-gouv/impress).

Also make sure it was not already answered in [an open or closed issue](https://github.com/numerique-gouv/impress/issues).

If your question was not covered, and you feel like it should be, fire away! We'd love to improve our docs! 👌

**Topic**
What's the general area of your question: for example, docker setup, database schema, search functionality,...

**Question**
Try to be as specific as possible so we can help you as best we can. Please be patient 🙏
11 .github/PULL_REQUEST_TEMPLATE.md vendored Normal file
@@ -0,0 +1,11 @@
## Purpose

Description...


## Proposal

Description...

- [ ] item 1...
- [ ] item 2...
52 .github/workflows/deploy.yml vendored Normal file
@@ -0,0 +1,52 @@
name: Deploy

on:
  push:
    tags:
      - 'preprod'
      - 'production'

jobs:
  notify-argocd:
    runs-on: ubuntu-latest
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "impress,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}
      -
        name: Call argocd github webhook
        run: |
          data='{"ref": "'$GITHUB_REF'","repository": {"html_url":"'$GITHUB_SERVER_URL'/'$GITHUB_REPOSITORY'"}}'
          sig=$(echo -n ${data} | openssl dgst -sha1 -hmac ''${ARGOCD_WEBHOOK_SECRET}'' | awk '{print "X-Hub-Signature: sha1="$2}')
          curl -X POST -H 'X-GitHub-Event:push' -H "Content-Type: application/json" -H "${sig}" --data "${data}" $ARGOCD_WEBHOOK_URL
          sig=$(echo -n ${data} | openssl dgst -sha1 -hmac ''${ARGOCD_PRODUCTION_WEBHOOK_SECRET}'' | awk '{print "X-Hub-Signature: sha1="$2}')
          curl -X POST -H 'X-GitHub-Event:push' -H "Content-Type: application/json" -H "${sig}" --data "${data}" $ARGOCD_PRODUCTION_WEBHOOK_URL

  start-test-on-preprod:
    needs:
      - notify-argocd
    runs-on: ubuntu-latest
    if: startsWith(github.event.ref, 'refs/tags/preprod')
    steps:
      -
        name: Debug
        run: |
          echo "Start test when preprod is ready"
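The "Call argocd github webhook" step in deploy.yml signs the JSON payload with `openssl dgst -sha1 -hmac` to produce the `X-Hub-Signature` header that GitHub-style webhook receivers verify. As a minimal sketch (the payload and secret values here are made up for illustration, not taken from the repo), the same header can be computed in Python:

```python
import hashlib
import hmac
import json

def hub_signature(payload: str, secret: str) -> str:
    # Hex-encoded HMAC-SHA1 of the raw request body, keyed with the webhook
    # secret, in the "sha1=<hex>" format GitHub-style webhooks expect.
    digest = hmac.new(secret.encode(), payload.encode(), hashlib.sha1).hexdigest()
    return f"sha1={digest}"

# Hypothetical values mirroring the $data JSON assembled in the workflow step.
payload = json.dumps({
    "ref": "refs/tags/preprod",
    "repository": {"html_url": "https://github.com/numerique-gouv/impress"},
})
header = f"X-Hub-Signature: {hub_signature(payload, 's3cr3t')}"
print(header)
```

The receiver (ArgoCD here) recomputes the HMAC over the body with its own copy of the secret and compares the digests; the workflow passes the ready-made header string to `curl -H "${sig}"`.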
185 .github/workflows/docker-hub.yml vendored Normal file
@@ -0,0 +1,185 @@
name: Docker Hub Workflow

on:
  workflow_dispatch:
  push:
    branches:
      - 'main'
    tags:
      - 'v*'
  pull_request:
    branches:
      - 'main'

env:
  DOCKER_USER: 1001:127

jobs:
  build-and-push-backend:
    runs-on: ubuntu-latest
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "impress,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}
      -
        name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: lasuite/impress-backend
      -
        name: Login to DockerHub
        if: github.event_name != 'pull_request'
        run: echo "$DOCKER_HUB_PASSWORD" | docker login -u "$DOCKER_HUB_USER" --password-stdin
      -
        name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          target: backend-production
          build-args: DOCKER_USER=${{ env.DOCKER_USER }}:-1000
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

  build-and-push-frontend:
    runs-on: ubuntu-latest
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "impress,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}
      -
        name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: lasuite/impress-frontend
      -
        name: Login to DockerHub
        if: github.event_name != 'pull_request'
        run: echo "$DOCKER_HUB_PASSWORD" | docker login -u "$DOCKER_HUB_USER" --password-stdin
      -
        name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./src/frontend/Dockerfile
          target: frontend-production
          build-args: DOCKER_USER=${{ env.DOCKER_USER }}:-1000
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

  build-and-push-y-webrtc-signaling:
    runs-on: ubuntu-latest
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "impress,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}
      -
        name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: lasuite/impress-y-webrtc-signaling
      -
        name: Login to DockerHub
        if: github.event_name != 'pull_request'
        run: echo "$DOCKER_HUB_PASSWORD" | docker login -u "$DOCKER_HUB_USER" --password-stdin
      -
        name: Build and push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./src/frontend/Dockerfile
          target: y-webrtc-signaling
          build-args: DOCKER_USER=${{ env.DOCKER_USER }}:-1000
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

  notify-argocd:
    needs:
      - build-and-push-frontend
      - build-and-push-backend
    runs-on: ubuntu-latest
    if: |
      github.event_name != 'pull_request'
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "impress,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}
      -
        name: Call argocd github webhook
        run: |
          data='{"ref": "'$GITHUB_REF'","repository": {"html_url":"'$GITHUB_SERVER_URL'/'$GITHUB_REPOSITORY'"}}'
          sig=$(echo -n ${data} | openssl dgst -sha1 -hmac ''${ARGOCD_WEBHOOK_SECRET}'' | awk '{print "X-Hub-Signature: sha1="$2}')
          curl -X POST -H 'X-GitHub-Event:push' -H "Content-Type: application/json" -H "${sig}" --data "${data}" $ARGOCD_WEBHOOK_URL
171 .github/workflows/impress-frontend.yml vendored Normal file
@@ -0,0 +1,171 @@
name: impress Workflow

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - "*"

jobs:
  install-front:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "18.x"

      - name: Restore the frontend cache
        uses: actions/cache@v4
        id: front-node_modules
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

      - name: Install dependencies
        if: steps.front-node_modules.outputs.cache-hit != 'true'
        run: cd src/frontend/ && yarn install --frozen-lockfile

      - name: Cache install frontend
        if: steps.front-node_modules.outputs.cache-hit != 'true'
        uses: actions/cache@v4
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

  build-front:
    runs-on: ubuntu-latest
    needs: install-front
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Restore the frontend cache
        uses: actions/cache@v4
        id: front-node_modules
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

      - name: Build CI App
        run: cd src/frontend/ && yarn ci:build

      - name: Cache build frontend
        uses: actions/cache@v4
        with:
          path: src/frontend/apps/impress/out/
          key: build-front-${{ github.run_id }}

  test-front:
    runs-on: ubuntu-latest
    needs: install-front
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Restore the frontend cache
        uses: actions/cache@v4
        id: front-node_modules
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

      - name: Test App
        run: cd src/frontend/ && yarn app:test

  lint-front:
    runs-on: ubuntu-latest
    needs: install-front
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Restore the frontend cache
        uses: actions/cache@v4
        id: front-node_modules
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

      - name: Check linting
        run: cd src/frontend/ && yarn lint

  test-e2e:
    runs-on: ubuntu-latest
    needs: build-front
    timeout-minutes: 15
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set services env variables
        run: |
          make data/media
          make create-env-files
          cat env.d/development/common.e2e.dist >> env.d/development/common

      - name: Restore the mail templates
        uses: actions/cache@v4
        id: mail-templates
        with:
          path: "src/backend/core/templates/mail"
          key: mail-templates-${{ hashFiles('src/mail/mjml') }}

      - name: Restore the frontend cache
        uses: actions/cache@v4
        id: front-node_modules
        with:
          path: "src/frontend/**/node_modules"
          key: front-node_modules-${{ hashFiles('src/frontend/**/yarn.lock') }}

      - name: Restore the build cache
        uses: actions/cache@v4
        id: cache-build
        with:
          path: src/frontend/apps/impress/out/
          key: build-front-${{ github.run_id }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build the Docker images
        uses: docker/bake-action@v4
        with:
          targets: |
            app-dev
            y-webrtc-signaling
          load: true
          set: |
            *.cache-from=type=gha,scope=cached-stage
            *.cache-to=type=gha,scope=cached-stage,mode=max

      - name: Start Docker services
        run: |
          make run

      - name: Apply DRF migrations
        run: |
          make migrate

      - name: Add dummy data
        run: |
          make demo FLUSH_ARGS='--no-input'

      - name: Install Playwright Browsers
        run: cd src/frontend/apps/e2e && yarn install

      - name: Run e2e tests
        run: cd src/frontend/ && yarn e2e:test

      - uses: actions/upload-artifact@v3
        if: always()
        with:
          name: playwright-report
          path: src/frontend/apps/e2e/report/
          retention-days: 7
272 .github/workflows/impress.yml vendored Normal file
@@ -0,0 +1,272 @@
name: impress Workflow

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - "*"

jobs:
  lint-git:
    runs-on: ubuntu-latest
    if: github.event_name == 'pull_request' # Makes sense only for pull requests
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: show
        run: git log
      - name: Enforce absence of print statements in code
        run: |
          ! git diff origin/${{ github.event.pull_request.base.ref }}..HEAD -- . ':(exclude)**/impress.yml' | grep "print("
      - name: Check absence of fixup commits
        run: |
          ! git log | grep 'fixup!'
      - name: Install gitlint
        run: pip install --user requests gitlint
      - name: Lint commit messages added to main
        run: ~/.local/bin/gitlint --commits origin/${{ github.event.pull_request.base.ref }}..HEAD

  check-changelog:
    runs-on: ubuntu-latest
    if: |
      contains(github.event.pull_request.labels.*.name, 'noChangeLog') == false &&
      github.event_name == 'pull_request'
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3
        with:
          fetch-depth: 50
      - name: Check that the CHANGELOG has been modified in the current branch
        run: git diff --name-only ${{ github.event.pull_request.base.sha }} ${{ github.event.after }} | grep 'CHANGELOG.md'

  lint-changelog:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Check CHANGELOG max line length
        run: |
          max_line_length=$(cat CHANGELOG.md | grep -Ev "^\[.*\]: https://github.com" | wc -L)
          if [ $max_line_length -ge 80 ]; then
            echo "ERROR: CHANGELOG has lines longer than 80 characters."
            exit 1
          fi

  build-mails:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: src/mail
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "18"

      - name: Restore the mail templates
        uses: actions/cache@v4
        id: mail-templates
        with:
          path: "src/backend/core/templates/mail"
          key: mail-templates-${{ hashFiles('src/mail/mjml') }}

      - name: Install yarn
        if: steps.mail-templates.outputs.cache-hit != 'true'
        run: npm install -g yarn

      - name: Install node dependencies
        if: steps.mail-templates.outputs.cache-hit != 'true'
        run: yarn install --frozen-lockfile

      - name: Build mails
        if: steps.mail-templates.outputs.cache-hit != 'true'
        run: yarn build

      - name: Cache mail templates
        if: steps.mail-templates.outputs.cache-hit != 'true'
        uses: actions/cache@v4
        with:
          path: "src/backend/core/templates/mail"
          key: mail-templates-${{ hashFiles('src/mail/mjml') }}

  lint-back:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: src/backend
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2
      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install development dependencies
        run: pip install --user .[dev]
      - name: Check code formatting with ruff
        run: ~/.local/bin/ruff format . --diff
      - name: Lint code with ruff
        run: ~/.local/bin/ruff check .
      - name: Lint code with pylint
        run: ~/.local/bin/pylint .

  test-back:
    runs-on: ubuntu-latest
    needs: build-mails

    defaults:
      run:
        working-directory: src/backend

    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_DB: impress
          POSTGRES_USER: dinum
          POSTGRES_PASSWORD: pass
        ports:
          - 5432:5432
        # needed because the postgres container does not provide a healthcheck
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    env:
      DJANGO_CONFIGURATION: Test
      DJANGO_SETTINGS_MODULE: impress.settings
      DJANGO_SECRET_KEY: ThisIsAnExampleKeyForTestPurposeOnly
      OIDC_OP_JWKS_ENDPOINT: /endpoint-for-test-purpose-only
      DB_HOST: localhost
      DB_NAME: impress
      DB_USER: dinum
      DB_PASSWORD: pass
      DB_PORT: 5432
      STORAGES_STATICFILES_BACKEND: django.contrib.staticfiles.storage.StaticFilesStorage
      AWS_S3_ENDPOINT_URL: http://localhost:9000
      AWS_S3_ACCESS_KEY_ID: impress
      AWS_S3_SECRET_ACCESS_KEY: password

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Create writable /data
        run: |
          sudo mkdir -p /data/media && \
          sudo mkdir -p /data/static

      - name: Restore the mail templates
        uses: actions/cache@v4
        id: mail-templates
        with:
          path: "src/backend/core/templates/mail"
          key: mail-templates-${{ hashFiles('src/mail/mjml') }}

      - name: Start Minio
        run: |
          docker pull minio/minio
          docker run -d --name minio \
            -p 9000:9000 \
            -e "MINIO_ACCESS_KEY=impress" \
            -e "MINIO_SECRET_KEY=password" \
            -v /data/media:/data \
            minio/minio server --console-address :9001 /data

      - name: Configure MinIO
        run: |
          MINIO=$(docker ps | grep minio/minio | sed -E 's/.*\s+([a-zA-Z0-9_-]+)$/\1/')
          docker exec ${MINIO} sh -c \
            "mc alias set impress http://localhost:9000 impress password && \
            mc alias ls && \
            mc mb impress/impress-media-storage && \
            mc version enable impress/impress-media-storage"

      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"

      - name: Install development dependencies
        run: pip install --user .[dev]

      - name: Install gettext (required to compile messages)
        run: |
          sudo apt-get update
          sudo apt-get install -y gettext

      - name: Generate a MO file from strings extracted from the project
        run: python manage.py compilemessages

      - name: Run tests
        run: ~/.local/bin/pytest -n 2

  i18n-crowdin:
    runs-on: ubuntu-latest
    steps:
      -
        uses: actions/create-github-app-token@v1
        id: app-token
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.PRIVATE_KEY }}
          owner: ${{ github.repository_owner }}
          repositories: "infrastructure,secrets"
      -
        name: Checkout repository
        uses: actions/checkout@v2
        with:
          submodules: recursive
          token: ${{ steps.app-token.outputs.token }}
      -
        name: Load sops secrets
        uses: rouja/actions-sops@main
        with:
          secret-file: secrets/numerique-gouv/impress/secrets.enc.env
          age-key: ${{ secrets.SOPS_PRIVATE }}

      - name: Install gettext (required to make messages)
        run: |
          sudo apt-get update
          sudo apt-get install -y gettext

      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"

      - name: Install development dependencies
        working-directory: src/backend
        run: pip install --user .[dev]

      - name: Generate the translation base file
        run: ~/.local/bin/django-admin makemessages --keep-pot --all

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "18.x"
          cache: "yarn"
          cache-dependency-path: src/frontend/yarn.lock

      - name: Install dependencies
        run: cd src/frontend/ && yarn install --frozen-lockfile

      - name: Extract the frontend translation
        run: make frontend-i18n-extract

      - name: Upload files to Crowdin
        run: |
          docker run \
            --rm \
            -e CROWDIN_API_TOKEN=$CROWDIN_API_TOKEN \
            -e CROWDIN_PROJECT_ID=$CROWDIN_PROJECT_ID \
            -e CROWDIN_BASE_PATH=$CROWDIN_BASE_PATH \
            -v "${{ github.workspace }}:/app" \
            crowdin/cli:3.16.0 \
            crowdin upload sources -c /app/crowdin/config.yml
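The lint-changelog job in impress.yml measures the longest CHANGELOG line with `wc -L` after filtering out GitHub link-reference lines. A standalone sketch of the same check in Python (the function name is mine, not from the repo):

```python
import re

# Link-reference lines like "[1.2.3]: https://github.com/..." are exempt,
# mirroring the workflow's `grep -Ev "^\[.*\]: https://github.com"` filter.
LINK_REF = re.compile(r"^\[.*\]: https://github\.com")

def overlong_lines(changelog: str, limit: int = 80) -> list[int]:
    # Return 1-based numbers of lines whose length reaches the limit,
    # matching the workflow's `[ $max_line_length -ge 80 ]` failure condition.
    return [
        number
        for number, line in enumerate(changelog.splitlines(), start=1)
        if not LINK_REF.match(line) and len(line) >= limit
    ]
```

Given a CHANGELOG body, a non-empty result corresponds to the CI job exiting 1.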
81 .gitignore vendored Normal file
@@ -0,0 +1,81 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
.DS_Store
.next/

# Translations
*.pot

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
env.d/development/*
!env.d/development/*.dist
env.d/terraform

# npm
node_modules

# Mails
src/backend/core/templates/mail/

# Typescript client
src/frontend/tsclient

# Swagger
**/swagger.json

# Logs
*.log

# Terraform
.terraform
*.tfstate
*.tfstate.backup

# Test & lint
.coverage
.pylint.d
.pytest_cache
db.sqlite3
.mypy_cache

# Site media
/data/

# IDEs
.idea/
.vscode/
*.iml
.devcontainer
78 .gitlint Normal file
@@ -0,0 +1,78 @@
# All these sections are optional, edit this file as you like.
[general]
# Ignore certain rules, you can reference them by their id or by their full name
# ignore=title-trailing-punctuation, T3

# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
# verbosity = 2

# By default gitlint will ignore merge commits. Set to 'false' to disable.
# ignore-merge-commits=true

# By default gitlint will ignore fixup commits. Set to 'false' to disable.
# ignore-fixup-commits=true

# By default gitlint will ignore squash commits. Set to 'false' to disable.
# ignore-squash-commits=true

# Enable debug mode (prints more output). Disabled by default.
# debug=true

# Set the extra-path where gitlint will search for user defined rules
# See http://jorisroovers.github.io/gitlint/user_defined_rules for details
extra-path=gitlint/

# [title-max-length]
# line-length=80

[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
# insensitive. It's fine if the keyword occurs as part of a larger word (so "WIPING"
# will not cause a violation, but "WIP: my title" will).
words=wip

# [title-match-regex]
# python-like regex (https://docs.python.org/2/library/re.html) that the
# commit-msg title must be matched to.
# Note that the regex can conflict with other rules if not used correctly
# (e.g. title-must-not-contain-word).
# regex=

# [B1]
# B1 = body-max-line-length
# line-length=120

# [body-min-length]
# min-length=5

# [body-is-missing]
# Whether to ignore this rule on merge commits (which typically only have a title)
# default = True
# ignore-merge-commits=false

# [body-changed-file-mention]
# List of files that need to be explicitly mentioned in the body when they are changed
# This is useful for when developers often erroneously edit certain files or git submodules.
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
# files=gitlint/rules.py,README.md

# [author-valid-email]
# python-like regex (https://docs.python.org/2/library/re.html) that the
# commit author email address should be matched to
# For example, use the following regex if you only want to allow email addresses from foo.com
# regex=[^@]+@foo.com

[ignore-by-title]
# Allow empty body & wrong title pattern only when bots (pyup/greenkeeper)
# upgrade dependencies
regex=^(⬆️.*|Update (.*) from (.*) to (.*)|(chore|fix)\(package\): update .*)$
ignore=B6,UC1

# [ignore-by-body]
# Ignore certain rules for commits of which the body has a line that matches a regex
# E.g. Match bodies that have a line that contains "release"
# regex=(.*)release(.*)
#
# Ignore certain rules, you can reference them by their id or by their full name
# Use 'all' to ignore all rules
# ignore=T1,body-min-length
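The `ignore-by-title` pattern above can be exercised outside gitlint with plain `grep -E`; the two sample commit titles below are made up for illustration:

```shell
# Titles matching the regex are exempt from rules B6 and UC1;
# anything else is linted normally.
regex='^(⬆️.*|Update (.*) from (.*) to (.*)|(chore|fix)\(package\): update .*)$'
echo "Update django from 4.2 to 5.0" | grep -qE "$regex" && echo "exempt"
echo "✨(backend) add login view" | grep -qE "$regex" || echo "linted"
```

The first title matches the dependency-bot alternative and prints `exempt`; the second does not match and prints `linted`.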
3 .gitmodules vendored Normal file
@@ -0,0 +1,3 @@
[submodule "secrets"]
	path = secrets
	url = ../secrets
52 CHANGELOG.md Normal file
@@ -0,0 +1,52 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0),
and this project adheres to
[Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

## Added

- Manage the document's rights (#75)
- Update document (#68)
- Remove document (#68)
- (docker) dockerize dev frontend (#63)
- (backend) list users with email filtering (#79)
- (frontend) add user to a document (#52)
- (frontend) invite user to a document (#52)
- (frontend) manage members (update role / list / remove) (#81)
- ✨(frontend) offline mode (#88)

## Changed

- Change most of the occurrences from pad to document (#76)
- Change site from Impress to Docs (#76)
- Generate PDF from a modal (#68)
- 🔧(helm) sticky session by request_uri for signaling server (#78)
- (frontend) change logo (#84)
- (frontend) pdf has title doc (#84)
- ⚡️(e2e) unique login between tests (#80)
- ⚡️(CI) improve e2e job (#86)
- ♻️(frontend) improve the error and message info ui (#93)

## Fixed

- Fix line breaks when generating PDFs (#84)

## Removed

- Remove trigger workflow on push tags on CI (#68)

## [0.1.0] - 2024-05-24

## Added

- Coming Soon page (#67)
- Impress, project to manage your documents easily and collaboratively.

[unreleased]: https://github.com/numerique-gouv/impress/compare/v0.1.0...main
[0.1.0]: https://github.com/numerique-gouv/impress/releases/v0.1.0
150 Dockerfile Normal file
@@ -0,0 +1,150 @@
# Django impress

# ---- base image to inherit from ----
FROM python:3.10-slim-bullseye as base

# Upgrade pip to its latest release to speed up dependencies installation
RUN python -m pip install --upgrade pip

# Upgrade system packages to install security updates
RUN apt-get update && \
  apt-get -y upgrade && \
  rm -rf /var/lib/apt/lists/*

# ---- Back-end builder image ----
FROM base as back-builder

WORKDIR /builder

# Copy required python dependencies
COPY ./src/backend /builder

RUN mkdir /install && \
  pip install --prefix=/install .

# ---- mails ----
FROM node:20 as mail-builder

COPY ./src/mail /mail/app

WORKDIR /mail/app

RUN yarn install --frozen-lockfile && \
  yarn build

# ---- static link collector ----
FROM base as link-collector
ARG IMPRESS_STATIC_ROOT=/data/static

# Install libpangocairo & rdfind
RUN apt-get update && \
  apt-get install -y \
    libpangocairo-1.0-0 \
    rdfind && \
  rm -rf /var/lib/apt/lists/*

# Copy installed python dependencies
COPY --from=back-builder /install /usr/local

# Copy impress application (see .dockerignore)
COPY ./src/backend /app/

WORKDIR /app

# collectstatic
RUN DJANGO_CONFIGURATION=Build DJANGO_JWT_PRIVATE_SIGNING_KEY=Dummy \
  python manage.py collectstatic --noinput

# Replace duplicated files by symlinks to decrease the overall size of the
# final image
RUN rdfind -makesymlinks true -followsymlinks true -makeresultsfile false ${IMPRESS_STATIC_ROOT}

# ---- Core application image ----
FROM base as core

ENV PYTHONUNBUFFERED=1

# Install required system libs
RUN apt-get update && \
  apt-get install -y \
    gettext \
    libcairo2 \
    libffi-dev \
    libgdk-pixbuf2.0-0 \
    libpango-1.0-0 \
    libpangocairo-1.0-0 \
    shared-mime-info && \
  rm -rf /var/lib/apt/lists/*

# Copy entrypoint
COPY ./docker/files/usr/local/bin/entrypoint /usr/local/bin/entrypoint

# Give the "root" group the same permissions as the "root" user on /etc/passwd
# to allow a user belonging to the root group to add new users; typically the
# docker user (see entrypoint).
RUN chmod g=u /etc/passwd

# Copy installed python dependencies
COPY --from=back-builder /install /usr/local

# Copy impress application (see .dockerignore)
COPY ./src/backend /app/

WORKDIR /app

# We wrap commands run in this container by the following entrypoint that
# creates a user on-the-fly with the container user ID (see USER) and root group
# ID.
ENTRYPOINT [ "/usr/local/bin/entrypoint" ]

# ---- Development image ----
FROM core as backend-development

# Switch back to the root user to install development dependencies
USER root:root

# Install psql
RUN apt-get update && \
  apt-get install -y postgresql-client && \
  rm -rf /var/lib/apt/lists/*

# Uninstall impress and re-install it in editable mode along with development
# dependencies
RUN pip uninstall -y impress
RUN pip install -e .[dev]

# Restore the un-privileged user running the application
ARG DOCKER_USER
USER ${DOCKER_USER}

# Target database host (e.g. database engine following docker compose services
# name) & port
ENV DB_HOST=postgresql \
  DB_PORT=5432

# Run django development server
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]

# ---- Production image ----
FROM core as backend-production

ARG IMPRESS_STATIC_ROOT=/data/static

# Gunicorn
RUN mkdir -p /usr/local/etc/gunicorn
COPY docker/files/usr/local/etc/gunicorn/impress.py /usr/local/etc/gunicorn/impress.py

# Un-privileged user running the application
ARG DOCKER_USER
USER ${DOCKER_USER}

# Copy statics
COPY --from=link-collector ${IMPRESS_STATIC_ROOT} ${IMPRESS_STATIC_ROOT}

# Copy impress mails
COPY --from=mail-builder /mail/backend/core/templates/mail /app/core/templates/mail

# The default command runs gunicorn WSGI server in impress's main module
CMD ["gunicorn", "-c", "/usr/local/etc/gunicorn/impress.py", "impress.wsgi:application"]
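The `chmod g=u /etc/passwd` step above exists so the entrypoint can register whatever runtime UID the container is started with. A minimal stand-alone sketch of that registration step, with a throwaway temp file standing in for `/etc/passwd` and a made-up user name and UID:

```shell
# Simulate the entrypoint's on-the-fly user creation against a throwaway
# passwd file; in the image, group-writability (chmod g=u) is what makes
# this append possible for a non-root UID running in the root group.
passwd_file=$(mktemp)
printf 'root:x:0:0:root:/root:/bin/bash\n' > "$passwd_file"
uid=10001
if ! cut -d: -f3 "$passwd_file" | grep -qx "$uid"; then
  echo "app:x:${uid}:0:app user:/app:/bin/sh" >> "$passwd_file"
fi
grep '^app:' "$passwd_file"
```

After the append, NSS lookups inside the container resolve the arbitrary UID to a named user, which some tools (and home-directory lookups) require.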
2 LICENSE
@@ -1,6 +1,6 @@
 MIT License
 
-Copyright (c) 2023 Direction Interministérielle du Numérique du Gouvernement Français
+Copyright (c) 2023 Direction Interministérielle du Numérique - Gouvernement Français
 
 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal
323 Makefile Normal file
@@ -0,0 +1,323 @@
# /!\ /!\ /!\ /!\ /!\ /!\ /!\ DISCLAIMER /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\
#
# This Makefile is only meant to be used for DEVELOPMENT purpose as we are
# changing the user id that will run in the container.
#
# PLEASE DO NOT USE IT FOR YOUR CI/PRODUCTION/WHATEVER...
#
# /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\ /!\
#
# Note to developers:
#
# While editing this file, please respect the following statements:
#
# 1. Every variable should be defined in the ad hoc VARIABLES section with a
#    relevant subsection
# 2. Every new rule should be defined in the ad hoc RULES section with a
#    relevant subsection depending on the targeted service
# 3. Rules should be sorted alphabetically within their section
# 4. When a rule has multiple dependencies, you should:
#    - duplicate the rule name to add the help string (if required)
#    - write one dependency per line to increase readability and diffs
# 5. .PHONY rule statement should be written after the corresponding rule
# ==============================================================================
# VARIABLES

BOLD := \033[1m
RESET := \033[0m
GREEN := \033[1;32m

# -- Database

DB_HOST = postgresql
DB_PORT = 5432

# -- Docker
# Get the current user ID to use for docker run and docker exec commands
DOCKER_UID = $(shell id -u)
DOCKER_GID = $(shell id -g)
DOCKER_USER = $(DOCKER_UID):$(DOCKER_GID)
COMPOSE = DOCKER_USER=$(DOCKER_USER) docker compose
COMPOSE_EXEC = $(COMPOSE) exec
COMPOSE_EXEC_APP = $(COMPOSE_EXEC) app-dev
COMPOSE_RUN = $(COMPOSE) run --rm
COMPOSE_RUN_APP = $(COMPOSE_RUN) app-dev
COMPOSE_RUN_CROWDIN = $(COMPOSE_RUN) crowdin crowdin
WAIT_DB = @$(COMPOSE_RUN) dockerize -wait tcp://$(DB_HOST):$(DB_PORT) -timeout 60s

# -- Backend
MANAGE = $(COMPOSE_RUN_APP) python manage.py
MAIL_YARN = $(COMPOSE_RUN) -w /app/src/mail node yarn
TSCLIENT_YARN = $(COMPOSE_RUN) -w /app/src/tsclient node yarn

# -- Frontend
PATH_FRONT = ./src/frontend

# ==============================================================================
# RULES

default: help

data/media:
	@mkdir -p data/media

data/static:
	@mkdir -p data/static

# -- Project

create-env-files: ## Copy the dist env files to env files
create-env-files: \
	env.d/development/common \
	env.d/development/crowdin \
	env.d/development/postgresql \
	env.d/development/kc_postgresql
.PHONY: create-env-files

bootstrap: ## Prepare Docker images for the project
bootstrap: \
	data/media \
	data/static \
	create-env-files \
	build \
	migrate \
	demo \
	back-i18n-compile \
	mails-install \
	mails-build
.PHONY: bootstrap

# -- Docker/compose
build: ## build the app-dev container
	@$(COMPOSE) build app-dev --no-cache
.PHONY: build

down: ## stop and remove containers, networks, images, and volumes
	@$(COMPOSE) down
.PHONY: down

logs: ## display app-dev logs (follow mode)
	@$(COMPOSE) logs -f app-dev
.PHONY: logs

run: ## start the wsgi (production) and development server
	@$(COMPOSE) up --force-recreate -d celery-dev
	@echo "Wait for postgresql to be up..."
	@$(WAIT_DB)
.PHONY: run

status: ## an alias for "docker compose ps"
	@$(COMPOSE) ps
.PHONY: status

stop: ## stop the development server using Docker
	@$(COMPOSE) stop
.PHONY: stop

# -- Backend

demo: ## flush db then create a demo for load testing purposes
	@$(MAKE) resetdb
	@$(MANAGE) create_demo
.PHONY: demo

# Nota bene: Black should come after isort just in case they don't agree...
lint: ## lint back-end python sources
lint: \
	lint-ruff-format \
	lint-ruff-check \
	lint-pylint
.PHONY: lint

lint-ruff-format: ## format back-end python sources with ruff
	@echo 'lint:ruff-format started…'
	@$(COMPOSE_RUN_APP) ruff format .
.PHONY: lint-ruff-format

lint-ruff-check: ## lint back-end python sources with ruff
	@echo 'lint:ruff-check started…'
	@$(COMPOSE_RUN_APP) ruff check . --fix
.PHONY: lint-ruff-check

lint-pylint: ## lint back-end python sources with pylint only on changed files from main
	@echo 'lint:pylint started…'
	bin/pylint --diff-only=origin/main
.PHONY: lint-pylint

test: ## run project tests
	@$(MAKE) test-back-parallel
.PHONY: test

test-back: ## run back-end tests
	@args="$(filter-out $@,$(MAKECMDGOALS))" && \
	bin/pytest $${args:-${1}}
.PHONY: test-back

test-back-parallel: ## run all back-end tests in parallel
	@args="$(filter-out $@,$(MAKECMDGOALS))" && \
	bin/pytest -n auto $${args:-${1}}
.PHONY: test-back-parallel

makemigrations: ## run django makemigrations for the impress project.
	@echo "$(BOLD)Running makemigrations$(RESET)"
	@$(COMPOSE) up -d postgresql
	@$(WAIT_DB)
	@$(MANAGE) makemigrations
.PHONY: makemigrations

migrate: ## run django migrations for the impress project.
	@echo "$(BOLD)Running migrations$(RESET)"
	@$(COMPOSE) up -d postgresql
	@$(WAIT_DB)
	@$(MANAGE) migrate
.PHONY: migrate

superuser: ## Create an admin superuser with password "admin"
	@echo "$(BOLD)Creating a Django superuser$(RESET)"
	@$(MANAGE) createsuperuser --email admin@example.com --password admin
.PHONY: superuser

back-i18n-compile: ## compile the gettext files
	@$(MANAGE) compilemessages --ignore="venv/**/*"
.PHONY: back-i18n-compile

back-i18n-generate: ## create the .pot files used for i18n
	@$(MANAGE) makemessages -a --keep-pot
.PHONY: back-i18n-generate

shell: ## run the django shell
	@$(MANAGE) shell #_plus
.PHONY: shell

# -- Database

dbshell: ## connect to database shell
	docker compose exec app-dev python manage.py dbshell
.PHONY: dbshell

resetdb: FLUSH_ARGS ?=
resetdb: ## flush database and create a superuser "admin"
	@echo "$(BOLD)Flush database$(RESET)"
	@$(MANAGE) flush $(FLUSH_ARGS)
	@${MAKE} superuser
.PHONY: resetdb

env.d/development/common:
	cp -n env.d/development/common.dist env.d/development/common

env.d/development/postgresql:
	cp -n env.d/development/postgresql.dist env.d/development/postgresql

env.d/development/kc_postgresql:
	cp -n env.d/development/kc_postgresql.dist env.d/development/kc_postgresql
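The env-file rules above rely on `cp -n` (no-clobber), so re-running `make create-env-files` never overwrites a locally edited env file. A stand-alone sketch of that behaviour, using throwaway temp files rather than the real `env.d` layout:

```shell
# The first copy creates the env file; later copies are skipped because the
# destination already exists. (|| true: recent coreutils make a skipped
# cp -n exit non-zero.)
dir=$(mktemp -d)
echo "DJANGO_DEBUG=True" > "$dir/common.dist"
cp -n "$dir/common.dist" "$dir/common"
echo "DJANGO_DEBUG=False" > "$dir/common.dist"   # the dist file changes later...
cp -n "$dir/common.dist" "$dir/common" || true   # ...but the env file is kept
cat "$dir/common"
```

This prints `DJANGO_DEBUG=True`: the local copy is preserved even after the `.dist` template changes.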
# -- Internationalization

env.d/development/crowdin:
	cp -n env.d/development/crowdin.dist env.d/development/crowdin

crowdin-download: ## Download translated messages from crowdin
	@$(COMPOSE_RUN_CROWDIN) download -c crowdin/config.yml
.PHONY: crowdin-download

crowdin-download-sources: ## Download sources from Crowdin
	@$(COMPOSE_RUN_CROWDIN) download sources -c crowdin/config.yml
.PHONY: crowdin-download-sources

crowdin-upload: ## Upload source translations to crowdin
	@$(COMPOSE_RUN_CROWDIN) upload sources -c crowdin/config.yml
.PHONY: crowdin-upload

i18n-compile: ## compile all translations
i18n-compile: \
	back-i18n-compile \
	frontend-i18n-compile
.PHONY: i18n-compile

i18n-generate: ## create the .pot files and extract frontend messages
i18n-generate: \
	back-i18n-generate \
	frontend-i18n-generate
.PHONY: i18n-generate

i18n-download-and-compile: ## download all translated messages and compile them to be used by all applications
i18n-download-and-compile: \
	crowdin-download \
	i18n-compile
.PHONY: i18n-download-and-compile

i18n-generate-and-upload: ## generate source translations for all applications and upload them to Crowdin
i18n-generate-and-upload: \
	i18n-generate \
	crowdin-upload
.PHONY: i18n-generate-and-upload


# -- Mail generator

mails-build: ## Convert mjml files to html and text
	@$(MAIL_YARN) build
.PHONY: mails-build

mails-build-html-to-plain-text: ## Convert html files to text
	@$(MAIL_YARN) build-html-to-plain-text
.PHONY: mails-build-html-to-plain-text

mails-build-mjml-to-html: ## Convert mjml files to html and text
	@$(MAIL_YARN) build-mjml-to-html
.PHONY: mails-build-mjml-to-html

mails-install: ## install the mail generator
	@$(MAIL_YARN) install
.PHONY: mails-install

# -- TS client generator

# FIXME: adapt this command
tsclient-install: ## Install the Typescript API client generator
	@$(TSCLIENT_YARN) install
.PHONY: tsclient-install

# FIXME: adapt this command
tsclient: tsclient-install ## Generate a Typescript API client
	@$(TSCLIENT_YARN) generate:api:client:local ../frontend/tsclient
.PHONY: tsclient

# -- Misc
clean: ## restore repository state as it was freshly cloned
	git clean -idx
.PHONY: clean

help:
	@echo "$(BOLD)impress Makefile$(RESET)"
	@echo "Please use 'make $(BOLD)target$(RESET)' where $(BOLD)target$(RESET) is one of:"
	@grep -E '^[a-zA-Z0-9_-]+:.*?## .*$$' $(firstword $(MAKEFILE_LIST)) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "$(GREEN)%-30s$(RESET) %s\n", $$1, $$2}'
.PHONY: help

# FIXME: adapt this command
frontend-i18n-extract: ## Extract the frontend translation inside a json to be used for crowdin
	cd $(PATH_FRONT) && yarn i18n:extract
.PHONY: frontend-i18n-extract

# FIXME: adapt this command
frontend-i18n-generate: ## Generate the frontend json files used for crowdin
frontend-i18n-generate: \
	crowdin-download-sources \
	frontend-i18n-extract
.PHONY: frontend-i18n-generate

# FIXME: adapt this command
frontend-i18n-compile: ## Format the crowdin json files used to deploy to the apps
	cd $(PATH_FRONT) && yarn i18n:deploy
.PHONY: frontend-i18n-compile

# -- K8S
build-k8s-cluster: ## build the kubernetes cluster using kind
	./bin/start-kind.sh
.PHONY: build-k8s-cluster

start-tilt: ## start the development environment using tilt
	tilt up -f ./bin/Tiltfile
.PHONY: start-tilt
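The self-documenting `help` target above works by grepping the `## ` annotations off each rule and splitting them with awk. A minimal stand-alone reproduction of that pipeline (the two sample rules and the temp file are made up, and the colour codes are dropped for clarity):

```shell
# Reproduce the help target's grep | sort | awk pipeline against a tiny
# makefile-like file: each annotated rule becomes one help line.
mk=$(mktemp)
printf 'logs: ## display app-dev logs\nbuild: ## build the app-dev container\n' > "$mk"
grep -E '^[a-zA-Z0-9_-]+:.*?## .*$' "$mk" | sort \
  | awk 'BEGIN {FS = ":.*?## "}; {printf "%-10s %s\n", $1, $2}'
```

Each matching line is split on `: ... ## `, leaving the target name in one column and its description in the other.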
82 README.md Normal file
@@ -0,0 +1,82 @@
# Impress

Impress prints your markdown to pdf from predefined templates with user and role based access rights.

Impress is built on top of [Django Rest
Framework](https://www.django-rest-framework.org/) and [Next.js](https://nextjs.org/).

## Getting started

### Prerequisite

Make sure you have a recent version of Docker and [Docker
Compose](https://docs.docker.com/compose/install) installed on your laptop:

```bash
$ docker -v
Docker version 20.10.2, build 2291f61

$ docker compose -v
docker compose version 1.27.4, build 40524192
```

> ⚠️ You may need to run the following commands with `sudo` but this can be
> avoided by assigning your user to the `docker` group.

### Project bootstrap

The easiest way to start working on the project is to use GNU Make:

```bash
$ make bootstrap FLUSH_ARGS='--no-input'
```

Then you can access the project in development mode by going to http://localhost:3000.
You will be prompted to log in; the default credentials are:

```bash
username: impress
password: impress
```

---

This command builds the `app` container, installs dependencies, performs
database migrations and compiles translations. It's a good idea to use this
command each time you are pulling code from the project repository to avoid
dependency-related or migration-related issues.

Your Docker services should now be up and running 🎉

[FIXME] Explain how to run the frontend project.

### Adding content

You can create a basic demo site by running:

```bash
$ make demo
```

Finally, you can check all available Make rules using:

```bash
$ make help
```

### Django admin

You can access the Django admin site at
[http://localhost:8071/admin](http://localhost:8071/admin).

You first need to create a superuser account:

```bash
$ make superuser
```

## Contributing

This project is intended to be community-driven, so please, do not hesitate to
get in touch if you have any questions related to our implementation or design
decisions.

## License

This work is released under the MIT License (see [LICENSE](./LICENSE)).
17 UPGRADE.md Normal file
@@ -0,0 +1,17 @@
# Upgrade

All instructions to upgrade this project from one release to the next will be
documented in this file. Upgrades must be run sequentially, meaning you should
not skip minor/major releases while upgrading (fix releases can be skipped).

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

For most upgrades, you just need to run the django migrations with
the following command inside your docker container:

`python manage.py migrate`

(Note: in your development environment, you can run `make migrate`.)

## [Unreleased]
68 bin/Tiltfile Normal file
@@ -0,0 +1,68 @@
load('ext://uibutton', 'cmd_button', 'bool_input', 'location')
load('ext://namespace', 'namespace_create', 'namespace_inject')
namespace_create('impress')

docker_build(
    'localhost:5001/impress-backend:latest',
    context='..',
    dockerfile='../Dockerfile',
    only=['./src/backend', './src/mail', './docker'],
    target = 'backend-production',
    live_update=[
        sync('../src/backend', '/app'),
        run(
            'pip install -r /app/requirements.txt',
            trigger=['./api/requirements.txt']
        )
    ]
)

docker_build(
    'localhost:5001/impress-y-webrtc-signaling:latest',
    context='..',
    dockerfile='../src/frontend/Dockerfile',
    only=['./src/frontend/', './docker/', './dockerignore'],
    target = 'y-webrtc-signaling',
    live_update=[
        sync('../src/frontend/apps/y-webrtc-signaling/src', '/home/frontend/apps/y-webrtc-signaling/src'),
    ]
)

docker_build(
    'localhost:5001/impress-frontend:latest',
    context='..',
    dockerfile='../src/frontend/Dockerfile',
    only=['./src/frontend', './docker', './dockerignore'],
    target = 'impress',
    live_update=[
        sync('../src/frontend', '/home/frontend'),
    ]
)

k8s_yaml(local('cd ../src/helm && helmfile -n impress -e dev template .'))

migration = '''
set -eu
# get k8s pod name from tilt resource name
POD_NAME="$(tilt get kubernetesdiscovery impress-backend -ojsonpath='{.status.pods[0].name}')"
kubectl -n impress exec "$POD_NAME" -- python manage.py makemigrations
'''
cmd_button('Make migration',
    argv=['sh', '-c', migration],
    resource='impress-backend',
    icon_name='developer_board',
    text='Run makemigration',
)

pod_migrate = '''
set -eu
# get k8s pod name from tilt resource name
POD_NAME="$(tilt get kubernetesdiscovery impress-backend -ojsonpath='{.status.pods[0].name}')"
kubectl -n impress exec "$POD_NAME" -- python manage.py migrate --no-input
'''
cmd_button('Migrate db',
    argv=['sh', '-c', pod_migrate],
    resource='impress-backend',
    icon_name='developer_board',
    text='Run database migration',
)
157  bin/_config.sh  Normal file
@@ -0,0 +1,157 @@
#!/usr/bin/env bash

set -eo pipefail

REPO_DIR="$(cd "$( dirname "${BASH_SOURCE[0]}" )/.." && pwd)"
UNSET_USER=0

TERRAFORM_DIRECTORY="./env.d/terraform"
COMPOSE_FILE="${REPO_DIR}/docker-compose.yml"
COMPOSE_PROJECT="impress"


# _set_user: set (or unset) the default user id used to run docker commands
#
# usage: _set_user
#
# You can override the default user ID (the current host user ID) by defining
# the USER_ID environment variable.
#
# To avoid running docker commands with a custom user, set the UNSET_USER
# environment variable to 1.
function _set_user() {

    if [ $UNSET_USER -eq 1 ]; then
        USER_ID=""
        return
    fi

    # USER_ID = USER_ID or `id -u` if USER_ID is not set
    USER_ID=${USER_ID:-$(id -u)}

    echo "🙋(user) ID: ${USER_ID}"
}

# _docker_compose: wrap the docker compose command
#
# usage: _docker_compose [options] [ARGS...]
#
# options: docker compose command options
# ARGS   : docker compose command arguments
function _docker_compose() {

    echo "🐳(compose) project: '${COMPOSE_PROJECT}' file: '${COMPOSE_FILE}'"
    docker compose \
        -p "${COMPOSE_PROJECT}" \
        -f "${COMPOSE_FILE}" \
        --project-directory "${REPO_DIR}" \
        "$@"
}

# _dc_run: wrap the docker compose run command
#
# usage: _dc_run [options] [ARGS...]
#
# options: docker compose run command options
# ARGS   : docker compose run command arguments
function _dc_run() {
    _set_user

    user_args="--user=$USER_ID"
    if [ -z "$USER_ID" ]; then
        user_args=""
    fi

    _docker_compose run --rm $user_args "$@"
}

# _dc_exec: wrap the docker compose exec command
#
# usage: _dc_exec [options] [ARGS...]
#
# options: docker compose exec command options
# ARGS   : docker compose exec command arguments
function _dc_exec() {
    _set_user

    echo "🐳(compose) exec command: '$*'"

    user_args="--user=$USER_ID"
    if [ -z "$USER_ID" ]; then
        user_args=""
    fi

    _docker_compose exec $user_args "$@"
}

# _django_manage: wrap django's manage.py command with docker compose
#
# usage : _django_manage [ARGS...]
#
# ARGS : django's manage.py command arguments
function _django_manage() {
    _dc_run "app-dev" python manage.py "$@"
}

# _set_openstack_project: select an OpenStack project from the openrc files
# defined in the terraform directory.
#
# usage: _set_openstack_project
#
# If necessary, the script will prompt the user to choose a project from those available.
function _set_openstack_project() {

    declare prompt
    declare -a projects
    declare -i default=1
    declare -i choice=0
    declare -i nb_projects

    # List projects by looking in the "./env.d/terraform" directory
    # and store them in an array
    read -r -a projects <<< "$(
        find "${TERRAFORM_DIRECTORY}" -maxdepth 1 -mindepth 1 -type d |
        sed 's|'"${TERRAFORM_DIRECTORY}\/"'||' |
        xargs
    )"
    nb_projects=${#projects[@]}

    if [[ ${nb_projects} -le 0 ]]; then
        echo "There are no OpenStack projects defined..." >&2
        echo "To add one, create a subdirectory in \"${TERRAFORM_DIRECTORY}\" with the name" \
            "of your project and copy your \"openrc.sh\" file into it." >&2
        exit 10
    fi

    if [[ ${nb_projects} -gt 1 ]]; then
        prompt="Select an OpenStack project to target:\\n"
        for (( i=0; i<nb_projects; i++ )); do
            prompt+="[$((i+1))] ${projects[$i]}"
            if [[ $((i+1)) -eq ${default} ]]; then
                prompt+=" (default)"
            fi
            prompt+="\\n"
        done
        prompt+="If your OpenStack project is not listed, add it to the \"env.d/terraform\" directory.\\n"
        prompt+="Your choice: "
        read -r -p "$(echo -e "${prompt}")" choice

        if [[ ${choice} -gt ${nb_projects} ]]; then
            (>&2 echo "Invalid choice ${choice} (should be <= ${nb_projects})")
            exit 11
        fi

        if [[ ${choice} -le 0 ]]; then
            choice=${default}
        fi
    fi

    project=${projects[$((choice-1))]}
    # Check that the openrc.sh file exists for this project
    if [ ! -f "${TERRAFORM_DIRECTORY}/${project}/openrc.sh" ]; then
        (>&2 echo "Missing \"openrc.sh\" file in \"${TERRAFORM_DIRECTORY}/${project}\". Check documentation.")
        exit 12
    fi

    echo "${project}"
}
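The `_set_user` helper above relies on bash's `${VAR:-default}` parameter expansion for the USER_ID fallback. A minimal standalone sketch of that behavior (the `4242` override is a hypothetical value for illustration):

```shell
# USER_ID falls back to the current uid only when unset or empty,
# mirroring the _set_user logic above.
unset USER_ID
USER_ID=${USER_ID:-$(id -u)}
echo "resolved uid: ${USER_ID}"

# An explicit value (hypothetical) wins over the fallback.
USER_ID=4242
USER_ID=${USER_ID:-$(id -u)}
echo "override uid: ${USER_ID}"
```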
6  bin/compose  Executable file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_docker_compose "$@"
6  bin/manage  Executable file
@@ -0,0 +1,6 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_django_manage "$@"
38  bin/pylint  Executable file
@@ -0,0 +1,38 @@
#!/usr/bin/env bash

# shellcheck source=bin/_config.sh
source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

declare diff_from
declare -a paths
declare -a args

# Parse options
for arg in "$@"
do
    case $arg in
        --diff-only=*)
            diff_from="${arg#*=}"
            shift
            ;;
        -*)
            args+=("$arg")
            shift
            ;;
        *)
            paths+=("$arg")
            shift
            ;;
    esac
done

if [[ -n "${diff_from}" ]]; then
    # Run pylint only on modified files located in src/backend
    # (excluding deleted files and migration files)
    # shellcheck disable=SC2207
    paths=($(git diff "${diff_from}" --name-only --diff-filter=d -- src/backend ':!**/migrations/*.py' | grep -E '^src/backend/.*\.py$'))
fi

# Fix docker vs local path when project sources are mounted as a volume
read -ra paths <<< "$(echo "${paths[@]}" | sed "s|src/backend/||g")"
_dc_run app-dev pylint "${paths[@]}" "${args[@]}"
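The final path rewrite in bin/pylint can be sketched in isolation: the `src/backend/` prefix is stripped so paths reported by git on the host resolve inside the container, where the backend sources are mounted at `/app`. The input paths below are hypothetical examples:

```shell
# Hypothetical input paths, as git diff would report them on the host.
paths=("src/backend/core/models.py" "src/backend/impress/urls.py")

# Same rewrite as in bin/pylint: drop the src/backend/ prefix.
read -ra fixed <<< "$(echo "${paths[@]}" | sed "s|src/backend/||g")"
echo "${fixed[@]}"
```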
8  bin/pytest  Executable file
@@ -0,0 +1,8 @@
#!/usr/bin/env bash

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_dc_run \
    -e DJANGO_CONFIGURATION=Test \
    app-dev \
    pytest "$@"
103  bin/start-kind.sh  Normal file
@@ -0,0 +1,103 @@
#!/bin/sh
set -o errexit

CURRENT_DIR=$(pwd)

echo "0. Create ca"
# 0. Create ca
mkcert -install
cd /tmp
mkcert "127.0.0.1.nip.io" "*.127.0.0.1.nip.io"
cd "$CURRENT_DIR"

echo "1. Create registry container unless it already exists"
# 1. Create registry container unless it already exists
reg_name='kind-registry'
reg_port='5001'
if [ "$(docker inspect -f '{{.State.Running}}' "${reg_name}" 2>/dev/null || true)" != 'true' ]; then
  docker run \
    -d --restart=always -p "127.0.0.1:${reg_port}:5000" --network bridge --name "${reg_name}" \
    registry:2
fi

echo "2. Create kind cluster with containerd registry config dir enabled"
# 2. Create kind cluster with containerd registry config dir enabled
# TODO: kind will eventually enable this by default and this patch will
# be unnecessary.
#
# See:
# https://github.com/kubernetes-sigs/kind/issues/2875
# https://github.com/containerd/containerd/blob/main/docs/cri/config.md#registry-configuration
# See: https://github.com/containerd/containerd/blob/main/docs/hosts.md
cat <<EOF | kind create cluster --config=-
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
containerdConfigPatches:
- |-
  [plugins."io.containerd.grpc.v1.cri".registry]
    config_path = "/etc/containerd/certs.d"
nodes:
- role: control-plane
  image: kindest/node:v1.27.3
  kubeadmConfigPatches:
  - |
    kind: InitConfiguration
    nodeRegistration:
      kubeletExtraArgs:
        node-labels: "ingress-ready=true"
  extraPortMappings:
  - containerPort: 80
    hostPort: 80
    protocol: TCP
  - containerPort: 443
    hostPort: 443
    protocol: TCP
- role: worker
  image: kindest/node:v1.27.3
- role: worker
  image: kindest/node:v1.27.3
EOF

echo "3. Add the registry config to the nodes"
# 3. Add the registry config to the nodes
#
# This is necessary because localhost resolves to loopback addresses that are
# network-namespace local.
# In other words: localhost in the container is not localhost on the host.
#
# We want a consistent name that works from both ends, so we tell containerd to
# alias localhost:${reg_port} to the registry container when pulling images
REGISTRY_DIR="/etc/containerd/certs.d/localhost:${reg_port}"
for node in $(kind get nodes); do
  docker exec "${node}" mkdir -p "${REGISTRY_DIR}"
  cat <<EOF | docker exec -i "${node}" cp /dev/stdin "${REGISTRY_DIR}/hosts.toml"
[host."http://${reg_name}:5000"]
EOF
done

echo "4. Connect the registry to the cluster network if not already connected"
# 4. Connect the registry to the cluster network if not already connected
# This allows kind to bootstrap the network but ensures they're on the same network
if [ "$(docker inspect -f='{{json .NetworkSettings.Networks.kind}}' "${reg_name}")" = 'null' ]; then
  docker network connect "kind" "${reg_name}"
fi

echo "5. Document the local registry"
# 5. Document the local registry
# https://github.com/kubernetes/enhancements/tree/master/keps/sig-cluster-lifecycle/generic/1755-communicating-a-local-registry
cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: ConfigMap
metadata:
  name: local-registry-hosting
  namespace: kube-public
data:
  localRegistryHosting.v1: |
    host: "localhost:${reg_port}"
    help: "https://kind.sigs.k8s.io/docs/user/local-registry/"
EOF

echo "6. Install ingress-nginx"
kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/main/deploy/static/provider/kind/deploy.yaml
kubectl -n ingress-nginx create secret tls mkcert --key /tmp/127.0.0.1.nip.io+1-key.pem --cert /tmp/127.0.0.1.nip.io+1.pem
kubectl -n ingress-nginx patch deployments.apps ingress-nginx-controller --type 'json' -p '[{"op": "add", "path": "/spec/template/spec/containers/0/args/-", "value":"--default-ssl-certificate=ingress-nginx/mkcert"}]'
25  bin/state  Executable file
@@ -0,0 +1,25 @@
#!/usr/bin/env bash
set -eo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

project=$(_set_openstack_project)
echo "Using \"${project}\" project..."

source "${TERRAFORM_DIRECTORY}/${project}/openrc.sh"

# Run Terraform commands in the Hashicorp docker container via docker compose
# shellcheck disable=SC2068
DOCKER_USER="$(id -u):$(id -g)" \
PROJECT="${project}" \
docker compose run --rm \
  -e OS_AUTH_URL \
  -e OS_IDENTITY_API_VERSION \
  -e OS_USER_DOMAIN_NAME \
  -e OS_PROJECT_DOMAIN_NAME \
  -e OS_TENANT_ID \
  -e OS_TENANT_NAME \
  -e OS_USERNAME \
  -e OS_PASSWORD \
  -e OS_REGION_NAME \
  terraform-state "$@"
26  bin/terraform  Executable file
@@ -0,0 +1,26 @@
#!/usr/bin/env bash
set -eo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

project=$(_set_openstack_project)
echo "Using \"${project}\" project..."

source "${TERRAFORM_DIRECTORY}/${project}/openrc.sh"

# Run Terraform commands in the Hashicorp docker container via docker compose
# shellcheck disable=SC2068
DOCKER_USER="$(id -u):$(id -g)" \
PROJECT="${project}" \
docker compose run --rm \
  -e OS_AUTH_URL \
  -e OS_IDENTITY_API_VERSION \
  -e OS_USER_DOMAIN_NAME \
  -e OS_PROJECT_DOMAIN_NAME \
  -e OS_TENANT_ID \
  -e OS_TENANT_NAME \
  -e OS_USERNAME \
  -e OS_PASSWORD \
  -e OS_REGION_NAME \
  -e TF_VAR_user_name \
  terraform "$@"
12  bin/update_openapi_schema  Executable file
@@ -0,0 +1,12 @@
#!/usr/bin/env bash

source "$(dirname "${BASH_SOURCE[0]}")/_config.sh"

_dc_run \
    -e DJANGO_CONFIGURATION=Test \
    app-dev \
    python manage.py spectacular \
    --api-version 'v1.0' \
    --urlconf 'impress.api_urls' \
    --format openapi-json \
    --file /app/core/tests/swagger/swagger.json
3  bin/updatekeys.sh  Normal file
@@ -0,0 +1,3 @@
#!/bin/bash

find . -name "*.enc.*" -exec sops updatekeys -y {} \;
29  crowdin/config.yml  Normal file
@@ -0,0 +1,29 @@
#
# Your crowdin's credentials
#
api_token_env: CROWDIN_API_TOKEN
project_id_env: CROWDIN_PROJECT_ID
base_path_env: CROWDIN_BASE_PATH

#
# Choose file structure in crowdin
# e.g. true or false
#
preserve_hierarchy: true

#
# Files configuration
#
files: [
  {
    source: "/backend/locale/django.pot",
    dest: "/backend-impress.pot",
    translation: "/backend/locale/%locale_with_underscore%/LC_MESSAGES/django.po"
  },
  {
    source: "/frontend/packages/i18n/locales/impress/translations-crowdin.json",
    dest: "/frontend-impress.json",
    translation: "/frontend/packages/i18n/locales/impress/%two_letters_code%/translations.json",
    skip_untranslated_strings: true,
  },
]
150  docker-compose.yml  Normal file
@@ -0,0 +1,150 @@
version: '3.8'

services:
  postgresql:
    image: postgres:16
    env_file:
      - env.d/development/postgresql
    ports:
      - "15432:5432"

  redis:
    image: redis:5

  mailcatcher:
    image: sj26/mailcatcher:latest
    ports:
      - "1081:1080"

  app-dev:
    build:
      context: .
      target: backend-development
      args:
        DOCKER_USER: ${DOCKER_USER:-1000}
    user: ${DOCKER_USER:-1000}
    image: impress:backend-development
    environment:
      - PYLINTHOME=/app/.pylint.d
      - DJANGO_CONFIGURATION=Development
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    ports:
      - "8071:8000"
    volumes:
      - ./src/backend:/app
      - ./data/static:/data/static
    depends_on:
      - postgresql
      - mailcatcher
      - redis
      - nginx

  celery-dev:
    user: ${DOCKER_USER:-1000}
    image: impress:backend-development
    command: ["celery", "-A", "impress.celery_app", "worker", "-l", "DEBUG"]
    environment:
      - DJANGO_CONFIGURATION=Development
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    volumes:
      - ./src/backend:/app
      - ./data/static:/data/static
    depends_on:
      - app-dev

  app:
    build:
      context: .
      target: backend-production
      args:
        DOCKER_USER: ${DOCKER_USER:-1000}
    user: ${DOCKER_USER:-1000}
    image: impress:backend-production
    environment:
      - DJANGO_CONFIGURATION=Demo
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    depends_on:
      - postgresql
      - redis

  celery:
    user: ${DOCKER_USER:-1000}
    image: impress:backend-production
    command: ["celery", "-A", "impress.celery_app", "worker", "-l", "INFO"]
    environment:
      - DJANGO_CONFIGURATION=Demo
    env_file:
      - env.d/development/common
      - env.d/development/postgresql
    depends_on:
      - app

  nginx:
    image: nginx:1.25
    ports:
      - "8083:8083"
    volumes:
      - ./docker/files/etc/nginx/conf.d:/etc/nginx/conf.d:ro
    depends_on:
      - keycloak

  dockerize:
    image: jwilder/dockerize

  crowdin:
    image: crowdin/cli:3.16.0
    volumes:
      - ".:/app"
    env_file:
      - env.d/development/crowdin
    user: "${DOCKER_USER:-1000}"
    working_dir: /app

  node:
    image: node:18
    user: "${DOCKER_USER:-1000}"
    environment:
      HOME: /tmp
    volumes:
      - ".:/app"

  kc_postgresql:
    image: postgres:14.3
    ports:
      - "5433:5432"
    env_file:
      - env.d/development/kc_postgresql

  keycloak:
    image: quay.io/keycloak/keycloak:20.0.1
    volumes:
      - ./docker/auth/realm.json:/opt/keycloak/data/import/realm.json
    command:
      - start-dev
      - --features=preview
      - --import-realm
      - --proxy=edge
      - --hostname-url=http://localhost:8083
      - --hostname-admin-url=http://localhost:8083/
      - --hostname-strict=false
      - --hostname-strict-https=false
    environment:
      KEYCLOAK_ADMIN: admin
      KEYCLOAK_ADMIN_PASSWORD: admin
      KC_DB: postgres
      KC_DB_URL_HOST: kc_postgresql
      KC_DB_URL_DATABASE: keycloak
      KC_DB_PASSWORD: pass
      KC_DB_USERNAME: impress
      KC_DB_SCHEMA: public
      PROXY_ADDRESS_FORWARDING: 'true'
    ports:
      - "8080:8080"
    depends_on:
      - kc_postgresql
2266  docker/auth/realm.json  Normal file
File diff suppressed because it is too large
13  docker/files/etc/nginx/conf.d/default.conf  Normal file
@@ -0,0 +1,13 @@

server {
    listen 8083;
    server_name localhost;
    charset utf-8;

    location / {
        proxy_pass http://keycloak:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
35  docker/files/usr/local/bin/entrypoint  Executable file
@@ -0,0 +1,35 @@
#!/bin/sh
#
# The container user (see USER in the Dockerfile) is an un-privileged user that
# does not exist and is not created during the build phase (see Dockerfile).
# Hence, we use this entrypoint to wrap commands that will be run in the
# container to create an entry for this user in the /etc/passwd file.
#
# The following environment variables may be passed to the container to
# customize the running user account:
#
# * USER_NAME: container user name (default: default)
# * HOME     : container user home directory (default: none)
#
# To pass environment variables, you can either use the -e option of the docker run command:
#
#     docker run --rm -e USER_NAME=foo -e HOME='/home/foo' impress:latest python manage.py migrate
#
# or define new variables in an environment file to use with docker or docker compose:
#
#     # env.d/production
#     USER_NAME=foo
#     HOME=/home/foo
#
#     docker run --rm --env-file env.d/production impress:latest python manage.py migrate
#

echo "🐳(entrypoint) creating user running in the container..."
if ! whoami > /dev/null 2>&1; then
  if [ -w /etc/passwd ]; then
    echo "${USER_NAME:-default}:x:$(id -u):$(id -g):${USER_NAME:-default} user:${HOME}:/sbin/nologin" >> /etc/passwd
  fi
fi

echo "🐳(entrypoint) running your command: ${*}"
exec "$@"
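The line the entrypoint appends follows the standard colon-separated 7-field /etc/passwd format (name, password placeholder, uid, gid, gecos, home, shell). A sketch with hypothetical values (user `demo`, uid/gid hard-coded to 1000, `HOME_DIR` standing in for the container's `$HOME`):

```shell
# Hypothetical values standing in for the container's runtime identity.
USER_NAME=demo
HOME_DIR=/home/demo

# Same 7-field /etc/passwd entry the entrypoint builds.
entry="${USER_NAME:-default}:x:1000:1000:${USER_NAME:-default} user:${HOME_DIR}:/sbin/nologin"
echo "$entry"
```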
16  docker/files/usr/local/etc/gunicorn/impress.py  Normal file
@@ -0,0 +1,16 @@
# Gunicorn-django settings
bind = ["0.0.0.0:8000"]
name = "impress"
python_path = "/app"

# Run
graceful_timeout = 90
timeout = 90
workers = 3

# Logging
# Using '-' for the access log file makes gunicorn log accesses to stdout
accesslog = "-"
# Using '-' for the error log file makes gunicorn log errors to stderr
errorlog = "-"
loglevel = "info"
25  docs/tsclient.md  Normal file
@@ -0,0 +1,25 @@
# TypeScript API client

The backend application can automatically generate a TypeScript client to be used in frontend
applications. It is used in the impress frontend application itself.

This client is built with [openapi-typescript-codegen](https://github.com/ferdikoomen/openapi-typescript-codegen)
and impress's backend OpenAPI schema (available [here](http://localhost:8071/v1.0/swagger/) if you have the backend running).

## Requirements

The client is generated from the OpenAPI schema served by Swagger, so you first need to
install and run the backend application.

## Install openApiClientJs

```sh
cd src/tsclient
yarn install
```

## Generate the client

```sh
yarn generate:api:client:local <output_path_for_generated_client>
```
41  env.d/development/common.dist  Normal file
@@ -0,0 +1,41 @@
# Django
DJANGO_ALLOWED_HOSTS=*
DJANGO_SECRET_KEY=ThisIsAnExampleKeyForDevPurposeOnly
DJANGO_SETTINGS_MODULE=impress.settings
DJANGO_SUPERUSER_PASSWORD=admin

# Python
PYTHONPATH=/app

# impress settings

# Mail
DJANGO_EMAIL_HOST="mailcatcher"
DJANGO_EMAIL_PORT=1025

# Backend url
IMPRESS_BASE_URL="http://localhost:8072"

# Media
STORAGES_STATICFILES_BACKEND=django.contrib.staticfiles.storage.StaticFilesStorage
AWS_S3_ENDPOINT_URL=http://minio:9000
AWS_S3_ACCESS_KEY_ID=impress
AWS_S3_SECRET_ACCESS_KEY=password

# OIDC
OIDC_OP_JWKS_ENDPOINT=http://nginx:8083/realms/impress/protocol/openid-connect/certs
OIDC_OP_AUTHORIZATION_ENDPOINT=http://localhost:8083/realms/impress/protocol/openid-connect/auth
OIDC_OP_TOKEN_ENDPOINT=http://nginx:8083/realms/impress/protocol/openid-connect/token
OIDC_OP_USER_ENDPOINT=http://nginx:8083/realms/impress/protocol/openid-connect/userinfo

OIDC_RP_CLIENT_ID=impress
OIDC_RP_CLIENT_SECRET=ThisIsAnExampleKeyForDevPurposeOnly
OIDC_RP_SIGN_ALGO=RS256
OIDC_RP_SCOPES="openid email"

LOGIN_REDIRECT_URL=http://localhost:3000
LOGIN_REDIRECT_URL_FAILURE=http://localhost:3000
LOGOUT_REDIRECT_URL=http://localhost:3000

OIDC_REDIRECT_ALLOWED_HOSTS=["http://localhost:8083", "http://localhost:3000"]
OIDC_AUTH_REQUEST_EXTRA_PARAMS={"acr_values": "eidas1"}
3  env.d/development/common.e2e.dist  Normal file
@@ -0,0 +1,3 @@
# For the CI job test-e2e
SUSTAINED_THROTTLE_RATES="200/hour"
BURST_THROTTLE_RATES="200/minute"
3  env.d/development/crowdin.dist  Normal file
@@ -0,0 +1,3 @@
CROWDIN_API_TOKEN=Your-Api-Token
CROWDIN_PROJECT_ID=Your-Project-Id
CROWDIN_BASE_PATH=/app/src
11  env.d/development/kc_postgresql.dist  Normal file
@@ -0,0 +1,11 @@
# Postgresql db container configuration
POSTGRES_DB=keycloak
POSTGRES_USER=impress
POSTGRES_PASSWORD=pass

# App database configuration
DB_HOST=kc_postgresql
DB_NAME=keycloak
DB_USER=impress
DB_PASSWORD=pass
DB_PORT=5433
11  env.d/development/postgresql.dist  Normal file
@@ -0,0 +1,11 @@
# Postgresql db container configuration
POSTGRES_DB=impress
POSTGRES_USER=dinum
POSTGRES_PASSWORD=pass

# App database configuration
DB_HOST=postgresql
DB_NAME=impress
DB_USER=dinum
DB_PASSWORD=pass
DB_PORT=5432
37  gitlint/gitlint_emoji.py  Normal file
@@ -0,0 +1,37 @@
"""
Gitlint extra rule to validate that the message title is of the form
"<gitmoji>(<scope>) <subject>"
"""
from __future__ import unicode_literals

import re

import requests

from gitlint.rules import CommitMessageTitle, LineRule, RuleViolation


class GitmojiTitle(LineRule):
    """
    This rule will enforce that each commit title is of the form "<gitmoji>(<scope>) <subject>"
    where gitmoji is an emoji from the list defined in https://gitmoji.carloscuesta.me and
    subject should be all lowercase
    """

    id = "UC1"
    name = "title-should-have-gitmoji-and-scope"
    target = CommitMessageTitle

    def validate(self, title, _commit):
        """
        Download the list of possible gitmojis from the project's github repository and
        check that the title contains one of them.
        """
        gitmojis = requests.get(
            "https://raw.githubusercontent.com/carloscuesta/gitmoji/master/packages/gitmojis/src/gitmojis.json"
        ).json()["gitmojis"]
        emojis = [item["emoji"] for item in gitmojis]
        pattern = r"^({:s})\(.*\)\s[a-z].*$".format("|".join(emojis))
        if not re.search(pattern, title):
            violation_msg = 'Title does not match regex "<gitmoji>(<scope>) <subject>"'
            return [RuleViolation(self.id, violation_msg, title)]
        return None
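The pattern built by GitmojiTitle can be exercised offline. The sketch below hard-codes three emojis instead of fetching gitmojis.json (an assumption for illustration; the real list is much larger) and checks a title in the same "<gitmoji>(<scope>) <subject>" shape:

```shell
# Hard-coded emoji alternation (assumption: the real rule downloads gitmojis.json).
title="✨(project) django boilerplate"
if printf '%s\n' "$title" | grep -qE '^(✨|🐛|♻️)\(.*\) [a-z]'; then
  result=valid
else
  result=invalid
fi
echo "gitmoji title check: $result"
```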
26  renovate.json  Normal file
@@ -0,0 +1,26 @@
{
  "extends": ["github>numerique-gouv/renovate-configuration"],
  "dependencyDashboard": true,
  "labels": ["dependencies", "noChangeLog"],
  "packageRules": [
    {
      "enabled": false,
      "groupName": "ignored python dependencies",
      "matchManagers": ["pep621"],
      "matchPackageNames": []
    },
    {
      "enabled": false,
      "groupName": "ignored js dependencies",
      "matchManagers": ["npm"],
      "matchPackageNames": [
        "node",
        "node-fetch",
        "i18next-parser",
        "eslint",
        "react",
        "react-dom"
      ]
    }
  ]
}
30  scripts/install-hooks.sh  Executable file
@@ -0,0 +1,30 @@
#!/bin/bash

mkdir -p "$(dirname -- "${BASH_SOURCE[0]}")/../.git/hooks/"
PRE_COMMIT_FILE="$(dirname -- "${BASH_SOURCE[0]}")/../.git/hooks/pre-commit"

cat <<'EOF' >"$PRE_COMMIT_FILE"
#!/bin/bash

# directories containing potential secrets
DIRS="."

bold=$(tput bold)
normal=$(tput sgr0)

# allow to read user input, assigns stdin to keyboard
exec </dev/tty

for d in $DIRS; do
    # find files containing secrets that should be encrypted
    for f in $(find "${d}" -type f -regex ".*\.enc\..*"); do
        if ! grep -q "unencrypted_suffix" "$f"; then
            printf '\xF0\x9F\x92\xA5 '
            echo "File $f has non encrypted secrets!"
            exit 1
        fi
    done
done
EOF

chmod +x "$PRE_COMMIT_FILE"
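The generated hook's heuristic is worth spelling out: sops leaves an `unencrypted_suffix` key in the metadata of files it has encrypted, so its absence from a `*.enc.*` file suggests plaintext secrets were committed. A standalone sketch of the check (the sample metadata is illustrative, not a real secret file):

```shell
# Illustrative tail of a sops-encrypted file's metadata.
sample='sops:
    unencrypted_suffix: _unencrypted'

# Same presence test the generated pre-commit hook performs.
if printf '%s\n' "$sample" | grep -q "unencrypted_suffix"; then
  status=encrypted
else
  status=plaintext
fi
echo "secret file status: $status"
```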
4  scripts/update-git-submodule.sh  Executable file
@@ -0,0 +1,4 @@
#!/bin/bash

git submodule update --init --recursive
git submodule foreach 'git fetch origin; git checkout $(git rev-parse --abbrev-ref HEAD); git reset --hard origin/$(git rev-parse --abbrev-ref HEAD); git submodule update --recursive; git clean -dfx'
3 scripts/updatekeys.sh Executable file
@@ -0,0 +1,3 @@
#!/bin/bash

find . -name "*.enc.*" -exec sops updatekeys -y {} \;
1 secrets Submodule
Submodule secrets added at a2b1357c0a
472 src/backend/.pylintrc Normal file
@@ -0,0 +1,472 @@
[MASTER]

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loaded into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=

# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=migrations

# Add files or directories matching the regex patterns to the blacklist. The
# regex matches against base names, not paths.
ignore-patterns=

# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=

# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use.
jobs=0

# List of plugins (as comma-separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=pylint_django,pylint.extensions.no_self_use

# Pickle collected data for later comparisons.
persistent=yes

# Specify a configuration file.
#rcfile=

# When enabled, pylint will attempt to guess common misconfigurations and emit
# user-friendly hints instead of false-positive error messages.
suggestion-mode=yes

# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no


[MESSAGES CONTROL]

# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=

# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then re-enable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W"
disable=bad-inline-option,
        deprecated-pragma,
        django-not-configured,
        file-ignored,
        locally-disabled,
        no-self-use,
        raw-checker-failed,
        suppressed-message,
        useless-suppression

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifiers separated by comma (,) or put this option
# multiple times (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=c-extension-no-member


[REPORTS]

# Python expression which should return a score less than 10 (10 is the highest
# score). You have access to the variables error, warning, refactor, convention
# and statement, which respectively contain the number of messages in each
# category and the total number of statements analyzed. This is used by the
# global evaluation report (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details
#msg-template=

# Set the output format. Available formats are text, parseable, colorized, json
# and msvs (visual studio). You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
output-format=text

# Tells whether to display a full report or only the messages
reports=no

# Activate the evaluation score.
score=yes


[REFACTORING]

# Maximum number of nested blocks for function / method body
max-nested-blocks=5

# Complete name of functions that never return. When checking for
# inconsistent-return-statements, if a never-returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=optparse.Values,sys.exit


[LOGGING]

# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging


[SPELLING]

# Limits count of emitted suggestions for spelling mistakes
max-spelling-suggestions=4

# Spelling dictionary name. Available dictionaries: none. To make it work,
# install the python-enchant package.
spelling-dict=

# List of comma-separated words that should not be checked.
spelling-ignore-words=

# A path to a file that contains a private dictionary; one word per line.
spelling-private-dict-file=

# Tells whether to store unknown words to the indicated private dictionary in
# the --spelling-private-dict-file option instead of raising a message.
spelling-store-unknown-words=no


[MISCELLANEOUS]

# List of note tags to take in consideration, separated by a comma.
notes=FIXME,
      XXX,
      TODO


[TYPECHECK]

# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=

# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes

# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes

# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local,responses,
                Template,Contact

# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis). It
# supports qualified module names, as well as Unix pattern matching.
ignored-modules=

# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes

# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1

# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1


[VARIABLES]

# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=

# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes

# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
          _cb

# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_

# Argument names that match this expression will be ignored. Defaults to names
# with a leading underscore.
ignored-argument-names=_.*|^ignored_|^unused_

# Tells whether we should check for unused imports in __init__ files.
init-import=no

# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins


[FORMAT]

# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=

# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$

# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4

# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
# tab).
indent-string='    '

# Maximum number of characters on a single line.
max-line-length=100

# Maximum number of lines in a module
max-module-lines=1000

# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
single-line-class-stmt=no

# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no


[SIMILARITIES]

# Ignore comments when computing similarities.
ignore-comments=yes

# Ignore docstrings when computing similarities.
ignore-docstrings=yes

# Ignore imports when computing similarities.
ignore-imports=yes

# Minimum lines number of a similarity.
# First implementations of CMS wizards have common fields we do not want to factorize for now
min-similarity-lines=35


[BASIC]

# Naming style matching correct argument names
argument-naming-style=snake_case

# Regular expression matching correct argument names. Overrides argument-
# naming-style
#argument-rgx=

# Naming style matching correct attribute names
attr-naming-style=snake_case

# Regular expression matching correct attribute names. Overrides attr-naming-
# style
#attr-rgx=

# Bad variable names which should always be refused, separated by a comma
bad-names=foo,
          bar,
          baz,
          toto,
          tutu,
          tata

# Naming style matching correct class attribute names
class-attribute-naming-style=any

# Regular expression matching correct class attribute names. Overrides class-
# attribute-naming-style
#class-attribute-rgx=

# Naming style matching correct class names
class-naming-style=PascalCase

# Regular expression matching correct class names. Overrides class-naming-style
#class-rgx=

# Naming style matching correct constant names
const-naming-style=UPPER_CASE

# Regular expression matching correct constant names. Overrides const-naming-
# style
const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__)|urlpatterns|logger)$

# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1

# Naming style matching correct function names
function-naming-style=snake_case

# Regular expression matching correct function names. Overrides function-
# naming-style
#function-rgx=

# Good variable names which should always be accepted, separated by a comma
good-names=i,
           j,
           k,
           cm,
           ex,
           Run,
           _

# Include a hint for the correct naming format with invalid-name
include-naming-hint=no

# Naming style matching correct inline iteration names
inlinevar-naming-style=any

# Regular expression matching correct inline iteration names. Overrides
# inlinevar-naming-style
#inlinevar-rgx=

# Naming style matching correct method names
method-naming-style=snake_case

# Regular expression matching correct method names. Overrides method-naming-
# style
method-rgx=([a-z_][a-z0-9_]{2,50}|setUp|set[Uu]pClass|tearDown|tear[Dd]ownClass|assert[A-Z]\w*|maxDiff|test_[a-z0-9_]+)$

# Naming style matching correct module names
module-naming-style=snake_case

# Regular expression matching correct module names. Overrides module-naming-
# style
#module-rgx=

# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=

# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_

# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
property-classes=abc.abstractproperty

# Naming style matching correct variable names
variable-naming-style=snake_case

# Regular expression matching correct variable names. Overrides variable-
# naming-style
#variable-rgx=


[IMPORTS]

# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no

# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no

# Deprecated modules which should not be used, separated by a comma
deprecated-modules=optparse,tkinter.tix

# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=

# Create a graph of every (i.e. internal and external) dependency in the given
# file (report RP0402 must not be disabled)
import-graph=

# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=

# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=

# Force import order to recognize a module as part of a third party library.
known-third-party=enchant


[CLASSES]

# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,
                      __new__,
                      setUp

# List of member names which should be excluded from the protected access
# warning.
exclude-protected=_asdict,
                  _fields,
                  _replace,
                  _source,
                  _make

# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls

# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs


[DESIGN]

# Maximum number of arguments for function / method
max-args=5

# Maximum number of attributes for a class (see R0902).
max-attributes=7

# Maximum number of boolean expressions in an if statement
max-bool-expr=5

# Maximum number of branches for function / method body
max-branches=12

# Maximum number of locals for function / method body
max-locals=15

# Maximum number of parents for a class (see R0901).
max-parents=7

# Maximum number of public methods for a class (see R0904).
max-public-methods=20

# Maximum number of return / yield for function / method body
max-returns=6

# Maximum number of statements in function / method body
max-statements=50

# Minimum number of public methods for a class (see R0903).
min-public-methods=0


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=builtins.Exception
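The `evaluation` expression in the `[REPORTS]` section is what turns pylint's message counts into the familiar score out of 10, weighting errors five times heavier than other messages. A quick sketch of the same arithmetic (the variable names are the ones the option exposes):

```python
def pylint_score(error, warning, refactor, convention, statement):
    """Reproduce the evaluation expression from the .pylintrc above:
    10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
    """
    return 10.0 - (
        (float(5 * error + warning + refactor + convention) / statement) * 10
    )
```

For instance, one error in a 100-statement module costs half a point, the same as five convention messages.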
3 src/backend/MANIFEST.in Normal file
@@ -0,0 +1,3 @@
include LICENSE
include README.md
recursive-include src/backend/impress *.html *.png *.gif *.css *.ico *.jpg *.jpeg *.po *.mo *.eot *.svg *.ttf *.woff *.woff2
0 src/backend/__init__.py Normal file
0 src/backend/core/__init__.py Normal file
64 src/backend/core/admin.py Normal file
@@ -0,0 +1,64 @@
"""Admin classes and registrations for core app."""
from django.contrib import admin
from django.contrib.auth import admin as auth_admin
from django.utils.translation import gettext_lazy as _

from . import models


@admin.register(models.User)
class UserAdmin(auth_admin.UserAdmin):
    """Admin class for the User model"""

    fieldsets = (
        (
            None,
            {
                "fields": (
                    "id",
                    "admin_email",
                    "password",
                )
            },
        ),
        (_("Personal info"), {"fields": ("sub", "email", "language", "timezone")}),
        (
            _("Permissions"),
            {
                "fields": (
                    "is_active",
                    "is_device",
                    "is_staff",
                    "is_superuser",
                    "groups",
                    "user_permissions",
                ),
            },
        ),
        (_("Important dates"), {"fields": ("created_at", "updated_at")}),
    )
    add_fieldsets = (
        (
            None,
            {
                "classes": ("wide",),
                "fields": ("email", "password1", "password2"),
            },
        ),
    )
    list_display = (
        "id",
        "sub",
        "admin_email",
        "email",
        "is_active",
        "is_staff",
        "is_superuser",
        "is_device",
        "created_at",
        "updated_at",
    )
    list_filter = ("is_staff", "is_superuser", "is_device", "is_active")
    ordering = ("is_active", "-is_superuser", "-is_staff", "-is_device", "-updated_at")
    readonly_fields = ("id", "sub", "email", "created_at", "updated_at")
    search_fields = ("id", "sub", "admin_email", "email")
39 src/backend/core/api/__init__.py Normal file
@@ -0,0 +1,39 @@
"""Impress core API endpoints"""
from django.conf import settings
from django.core.exceptions import ValidationError

from rest_framework import exceptions as drf_exceptions
from rest_framework import views as drf_views
from rest_framework.decorators import api_view
from rest_framework.response import Response


def exception_handler(exc, context):
    """Handle Django ValidationError as an accepted exception.

    For the parameters, see ``exception_handler``
    This code comes from twidi's gist:
    https://gist.github.com/twidi/9d55486c36b6a51bdcb05ce3a763e79f
    """
    if isinstance(exc, ValidationError):
        if hasattr(exc, "message_dict"):
            detail = exc.message_dict
        elif hasattr(exc, "message"):
            detail = exc.message
        elif hasattr(exc, "messages"):
            detail = exc.messages

        exc = drf_exceptions.ValidationError(detail=detail)

    return drf_views.exception_handler(exc, context)


# pylint: disable=unused-argument
@api_view(["GET"])
def get_frontend_configuration(request):
    """Returns the frontend configuration dict as configured in settings."""
    frontend_configuration = {
        "LANGUAGE_CODE": settings.LANGUAGE_CODE,
    }
    frontend_configuration.update(settings.FRONTEND_CONFIGURATION)
    return Response(frontend_configuration)
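The branching in `exception_handler` mirrors Django's `ValidationError` API, which may expose field errors as `message_dict`, a single `message`, or a list of `messages`. That precedence can be sketched without Django, using a stand-in exception class (both names below are hypothetical, only there to show the extraction order):

```python
class StandInValidationError(Exception):
    """Stand-in for django.core.exceptions.ValidationError, which exposes
    message_dict (per-field errors), message (one error) or messages (a list)."""

    def __init__(self, message_dict=None, message=None, messages=None):
        super().__init__()
        if message_dict is not None:
            self.message_dict = message_dict
        if message is not None:
            self.message = message
        if messages is not None:
            self.messages = messages


def extract_detail(exc):
    """Pick the richest representation available, in the same order as the
    handler above: message_dict, then message, then messages."""
    if hasattr(exc, "message_dict"):
        return exc.message_dict
    if hasattr(exc, "message"):
        return exc.message
    if hasattr(exc, "messages"):
        return exc.messages
    return None
```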
36 src/backend/core/api/permissions.py Normal file
@@ -0,0 +1,36 @@
"""Permission handlers for the impress core app."""
from rest_framework import permissions

ACTION_FOR_METHOD_TO_PERMISSION = {
    "versions_detail": {"DELETE": "versions_destroy", "GET": "versions_retrieve"}
}


class IsAuthenticated(permissions.BasePermission):
    """
    Allows access only to authenticated users. Alternative method checking the presence
    of the auth token to avoid hitting the database.
    """

    def has_permission(self, request, view):
        return bool(request.auth) or request.user.is_authenticated


class IsAuthenticatedOrSafe(IsAuthenticated):
    """Allows access to authenticated users (or anonymous users, but only on safe methods)."""

    def has_permission(self, request, view):
        if request.method in permissions.SAFE_METHODS:
            return True
        return super().has_permission(request, view)


class IsSelf(IsAuthenticated):
    """
    Allows access only to authenticated users acting on their own user object.
    """

    def has_object_permission(self, request, view, obj):
        """Write permissions are only allowed to the user itself."""
        return obj == request.user
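`IsAuthenticatedOrSafe` combines DRF's safe-method convention (GET, HEAD and OPTIONS are read-only) with the cheap token check from `IsAuthenticated`. The decision table can be sketched without DRF; `SAFE_METHODS` below holds the same values as DRF's tuple, while the `allows` helper and its flat arguments are a stand-in for the request object:

```python
SAFE_METHODS = ("GET", "HEAD", "OPTIONS")  # same values as rest_framework.permissions.SAFE_METHODS

def allows(method, has_auth_token, is_authenticated):
    """Mirror IsAuthenticatedOrSafe.has_permission: safe methods are always
    allowed; otherwise the caller needs an auth token or an authenticated user."""
    if method in SAFE_METHODS:
        return True
    return bool(has_auth_token) or is_authenticated
```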
13 src/backend/core/api/serializers.py Normal file
@@ -0,0 +1,13 @@
"""Client serializers for the impress core app."""
from rest_framework import serializers

from core import models


class UserSerializer(serializers.ModelSerializer):
    """Serialize users."""

    class Meta:
        model = models.User
        fields = ["id", "email"]
        read_only_fields = ["id", "email"]
142 src/backend/core/api/viewsets.py Normal file
@@ -0,0 +1,142 @@
"""API endpoints"""
from rest_framework import (
    decorators,
    mixins,
    pagination,
    viewsets,
)
from rest_framework import (
    response as drf_response,
)

from core import models

from . import permissions, serializers

# pylint: disable=too-many-ancestors


class NestedGenericViewSet(viewsets.GenericViewSet):
    """
    A generic ViewSet meant to be used in a nested route context.
    e.g: `/api/v1.0/resource_1/<resource_1_pk>/resource_2/<resource_2_pk>/`

    It allows defining all the url kwargs and lookup fields needed to perform the lookup.
    """

    lookup_fields: list[str] = ["pk"]
    lookup_url_kwargs: list[str] = []

    def __getattribute__(self, item):
        """
        This method is overridden to return the last lookup field or lookup url kwarg
        when accessing the `lookup_field` or `lookup_url_kwarg` attribute. This is useful
        to keep compatibility with all methods used by the parent class `GenericViewSet`.
        """
        if item in ["lookup_field", "lookup_url_kwarg"]:
            return getattr(self, item + "s", [None])[-1]

        return super().__getattribute__(item)

    def get_queryset(self):
        """
        Get the list of items for this view.

        The `lookup_fields` attribute is enumerated here to perform the nested lookup.
        """
        queryset = super().get_queryset()

        # The last lookup field is removed to perform the nested lookup as it corresponds
        # to the object pk; it is used within the get_object method.
        lookup_url_kwargs = (
            self.lookup_url_kwargs[:-1]
            if self.lookup_url_kwargs
            else self.lookup_fields[:-1]
        )

        filter_kwargs = {}
        for index, lookup_url_kwarg in enumerate(lookup_url_kwargs):
            if lookup_url_kwarg not in self.kwargs:
                raise KeyError(
                    f"Expected view {self.__class__.__name__} to be called with a URL "
                    f'keyword argument named "{lookup_url_kwarg}". Fix your URL conf, or '
                    "set the `.lookup_fields` attribute on the view correctly."
                )

            filter_kwargs.update(
                {self.lookup_fields[index]: self.kwargs[lookup_url_kwarg]}
            )

        return queryset.filter(**filter_kwargs)


class SerializerPerActionMixin:
    """
    A mixin allowing a serializer class to be defined for each action.

    It avoids having to dispatch on the action by hand inside the
    `get_serializer_class` method.
    """

    serializer_classes: dict[str, type] = {}
    default_serializer_class: type = None

    def get_serializer_class(self):
        """Return the serializer class to use depending on the action."""
        return self.serializer_classes.get(self.action, self.default_serializer_class)


class Pagination(pagination.PageNumberPagination):
    """Pagination to display no more than 100 objects per page sorted by creation date."""

    ordering = "-created_on"
    max_page_size = 100
    page_size_query_param = "page_size"


class UserViewSet(
    mixins.UpdateModelMixin, viewsets.GenericViewSet, mixins.ListModelMixin
):
    """User ViewSet"""

    permission_classes = [permissions.IsSelf]
    queryset = models.User.objects.all()
    serializer_class = serializers.UserSerializer

    def get_queryset(self):
        """
        Limit listed users by querying the email field with a trigram similarity
        search if a query is provided.
        Limit listed users by excluding users already in the document if a document_id
        is provided.
        """
        queryset = self.queryset

        if self.action == "list":
            # Exclude all users already in the given document
            if document_id := self.request.GET.get("document_id", ""):
                queryset = queryset.exclude(documentaccess__document_id=document_id)

            # Filter users by email similarity
            if query := self.request.GET.get("q", ""):
                queryset = queryset.filter(email__trigram_word_similar=query)

        return queryset

    @decorators.action(
        detail=False,
        methods=["get"],
        url_name="me",
        url_path="me",
        permission_classes=[permissions.IsAuthenticated],
    )
    def get_me(self, request):
        """Return information on the currently logged-in user."""
        context = {"request": request}
        return drf_response.Response(
            self.serializer_class(request.user, context=context).data
        )
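`NestedGenericViewSet.get_queryset` pairs each captured URL kwarg with the corresponding entry of `lookup_fields`, dropping the last one because it identifies the object itself and is handled later by `get_object`. Just that kwarg-building step can be sketched outside Django (the helper name is illustrative):

```python
def build_filter_kwargs(lookup_fields, lookup_url_kwargs, url_kwargs):
    """Map each parent lookup field to its captured URL value, excluding the
    last entry, which identifies the object itself (left to get_object)."""
    keys = lookup_url_kwargs[:-1] if lookup_url_kwargs else lookup_fields[:-1]
    filter_kwargs = {}
    for index, key in enumerate(keys):
        if key not in url_kwargs:
            raise KeyError(f'Expected a URL keyword argument named "{key}".')
        filter_kwargs[lookup_fields[index]] = url_kwargs[key]
    return filter_kwargs
```

For `/api/v1.0/documents/42/accesses/7/`, the parent `document_id` becomes a filter while the trailing pk does not.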
11 src/backend/core/apps.py Normal file
@@ -0,0 +1,11 @@
"""Impress Core application"""
# from django.apps import AppConfig
# from django.utils.translation import gettext_lazy as _


# class CoreConfig(AppConfig):
#     """Configuration class for the impress core app."""

#     name = "core"
#     app_label = "core"
#     verbose_name = _("impress core application")
0
src/backend/core/authentication/__init__.py
Normal file
100
src/backend/core/authentication/backends.py
Normal file
@@ -0,0 +1,100 @@
"""Authentication Backends for the Impress core app."""

from django.core.exceptions import SuspiciousOperation
from django.utils.translation import gettext_lazy as _

import requests
from mozilla_django_oidc.auth import (
    OIDCAuthenticationBackend as MozillaOIDCAuthenticationBackend,
)

from core.models import User


class OIDCAuthenticationBackend(MozillaOIDCAuthenticationBackend):
    """Custom OpenID Connect (OIDC) Authentication Backend.

    This class overrides the default OIDC Authentication Backend to accommodate differences
    in the User and Identity models, and to handle signed and/or encrypted UserInfo responses.
    """

    def get_userinfo(self, access_token, id_token, payload):
        """Return the user details dictionary.

        Parameters:
        - access_token (str): The access token.
        - id_token (str): The id token (unused).
        - payload (dict): The token payload (unused).

        Note: The id_token and payload parameters are unused in this implementation,
        but were kept to preserve the base method signature.

        Note: This method handles a signed and/or encrypted UserInfo response, as
        required by Agent Connect, which follows the OIDC standard. This forces us
        to override the base method, which only deals with 'application/json' responses.

        Returns:
        - dict: User details dictionary obtained from the OpenID Connect user endpoint.
        """
        user_response = requests.get(
            self.OIDC_OP_USER_ENDPOINT,
            headers={"Authorization": f"Bearer {access_token}"},
            verify=self.get_settings("OIDC_VERIFY_SSL", True),
            timeout=self.get_settings("OIDC_TIMEOUT", None),
            proxies=self.get_settings("OIDC_PROXY", None),
        )
        user_response.raise_for_status()
        userinfo = self.verify_token(user_response.text)
        return userinfo

    def get_or_create_user(self, access_token, id_token, payload):
        """Return a User based on userinfo. Get or create a new user if no user matches the sub.

        Parameters:
        - access_token (str): The access token.
        - id_token (str): The ID token.
        - payload (dict): The user payload.

        Returns:
        - User: An existing or newly created User instance.

        Raises:
        - Exception: Raised when user creation is not allowed and no existing user is found.
        """
        user_info = self.get_userinfo(access_token, id_token, payload)
        sub = user_info.get("sub")

        if sub is None:
            raise SuspiciousOperation(
                _("User info contained no recognizable user identification")
            )

        try:
            user = User.objects.get(sub=sub)
        except User.DoesNotExist:
            if self.get_settings("OIDC_CREATE_USER", True):
                user = self.create_user(user_info)
            else:
                user = None

        return user

    def create_user(self, claims):
        """Return a newly created User instance."""
        sub = claims.get("sub")

        if sub is None:
            raise SuspiciousOperation(
                _("Claims contained no recognizable user identification")
            )

        user = User.objects.create(
            sub=sub,
            email=claims.get("email"),
            password="!",  # noqa: S106
        )

        return user
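The control flow of `get_or_create_user` — look up by `sub`, create only when the creation setting allows — can be sketched without the ORM, using a dict as a stand-in store (`OIDC_CREATE_USER` here mirrors the setting read via `self.get_settings`; everything below is illustrative, not the real backend):

```python
OIDC_CREATE_USER = True
users = {}  # sub -> user record, stand-in for the User table


def get_or_create_user(user_info: dict):
    """Return an existing record for `sub`, create one, or return None."""
    sub = user_info.get("sub")
    if sub is None:
        # The real backend raises SuspiciousOperation here.
        raise ValueError("User info contained no recognizable user identification")
    if sub in users:
        return users[sub]
    if OIDC_CREATE_USER:
        # Unusable password "!" mirrors the OIDC-only user creation above.
        users[sub] = {"sub": sub, "email": user_info.get("email"), "password": "!"}
        return users[sub]
    return None


first = get_or_create_user({"sub": "abc", "email": "a@example.com"})
again = get_or_create_user({"sub": "abc"})  # same record, not a duplicate
```

The key property: repeated logins with the same `sub` resolve to one record.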
18
src/backend/core/authentication/urls.py
Normal file
@@ -0,0 +1,18 @@
"""Authentication URLs for the People core app."""

from django.urls import path

from mozilla_django_oidc.urls import urlpatterns as mozilla_oidc_urls

from .views import OIDCLogoutCallbackView, OIDCLogoutView

urlpatterns = [
    # Override the default 'logout/' path from Mozilla Django OIDC with our custom view.
    path("logout/", OIDCLogoutView.as_view(), name="oidc_logout_custom"),
    path(
        "logout-callback/",
        OIDCLogoutCallbackView.as_view(),
        name="oidc_logout_callback",
    ),
    *mozilla_oidc_urls,
]
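Ordering matters here: Django's URL resolver returns the first pattern that matches, so listing the custom `logout/` entry before the Mozilla Django OIDC patterns shadows the default one. A toy first-match resolver (names are illustrative):

```python
# Earlier entries win, exactly like Django's top-to-bottom URL resolution.
routes = [
    ("logout/", "OIDCLogoutView"),            # our override, listed first
    ("logout-callback/", "OIDCLogoutCallbackView"),
    ("logout/", "MozillaDefaultLogoutView"),  # default, never reached for "logout/"
]


def resolve(path: str) -> str:
    """Return the view name of the first route whose pattern matches."""
    return next(view for pattern, view in routes if pattern == path)
```

Swapping the list order would silently restore the default logout behaviour, which is why the override comment sits directly on the `path("logout/", ...)` line.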
137
src/backend/core/authentication/views.py
Normal file
@@ -0,0 +1,137 @@
"""Authentication Views for the People core app."""

from urllib.parse import urlencode

from django.contrib import auth
from django.core.exceptions import SuspiciousOperation
from django.http import HttpResponseRedirect
from django.urls import reverse
from django.utils import crypto

from mozilla_django_oidc.utils import (
    absolutify,
)
from mozilla_django_oidc.views import (
    OIDCLogoutView as MozillaOIDCOIDCLogoutView,
)


class OIDCLogoutView(MozillaOIDCOIDCLogoutView):
    """Custom logout view for handling the OpenID Connect (OIDC) logout flow.

    Adds support for handling logout callbacks from the identity provider (OP)
    by initiating the logout flow if the user has an active session.

    The Django session is retained during the logout process to persist the 'state' OIDC
    parameter. This parameter is crucial for maintaining the integrity of the logout flow
    between this call and the subsequent callback.
    """

    @staticmethod
    def persist_state(request, state):
        """Persist the given 'state' parameter in the session's 'oidc_states' dictionary.

        This method is used to store the OIDC state parameter in the session, according to the
        structure expected by Mozilla Django OIDC's 'add_state_and_verifier_and_nonce_to_session'
        utility function.
        """
        if "oidc_states" not in request.session or not isinstance(
            request.session["oidc_states"], dict
        ):
            request.session["oidc_states"] = {}

        request.session["oidc_states"][state] = {}
        request.session.save()

    def construct_oidc_logout_url(self, request):
        """Create the redirect URL for interfacing with the OIDC provider.

        Retrieves the necessary parameters from the session and constructs the URL
        required to initiate logout with the OpenID Connect provider.

        If no ID token is found in the session, the logout flow will not be initiated,
        and the method will return the default redirect URL.

        The 'state' parameter is generated randomly and persisted in the session to ensure
        its integrity during the subsequent callback.
        """
        oidc_logout_endpoint = self.get_settings("OIDC_OP_LOGOUT_ENDPOINT")

        if not oidc_logout_endpoint:
            return self.redirect_url

        reverse_url = reverse("oidc_logout_callback")
        id_token = request.session.get("oidc_id_token", None)

        if not id_token:
            return self.redirect_url

        query = {
            "id_token_hint": id_token,
            "state": crypto.get_random_string(self.get_settings("OIDC_STATE_SIZE", 32)),
            "post_logout_redirect_uri": absolutify(request, reverse_url),
        }

        self.persist_state(request, query["state"])

        return f"{oidc_logout_endpoint}?{urlencode(query)}"

    def post(self, request):
        """Handle user logout.

        If the user is not authenticated, redirects to the default logout URL.
        Otherwise, constructs the OIDC logout URL and redirects the user to start
        the logout process.

        If the user is redirected to the default logout URL, ensure their Django session
        is terminated.
        """
        logout_url = self.redirect_url

        if request.user.is_authenticated:
            logout_url = self.construct_oidc_logout_url(request)

        # If the user is not redirected to the OIDC provider, ensure logout
        if logout_url == self.redirect_url:
            auth.logout(request)

        return HttpResponseRedirect(logout_url)


class OIDCLogoutCallbackView(MozillaOIDCOIDCLogoutView):
    """Custom view for handling the logout callback from the OpenID Connect (OIDC) provider.

    Handles the callback after logout from the identity provider (OP).
    Verifies the state parameter and performs the necessary logout actions.

    The Django session is maintained during the logout process to ensure the integrity
    of the logout flow initiated in the previous step.
    """

    http_method_names = ["get"]

    def get(self, request):
        """Handle the logout callback.

        If the user is not authenticated, redirects to the default logout URL.
        Otherwise, verifies the state parameter and performs the necessary logout actions.
        """
        if not request.user.is_authenticated:
            return HttpResponseRedirect(self.redirect_url)

        state = request.GET.get("state")

        if state not in request.session.get("oidc_states", {}):
            msg = "OIDC callback state not found in session `oidc_states`!"
            raise SuspiciousOperation(msg)

        del request.session["oidc_states"][state]
        request.session.save()

        auth.logout(request)

        return HttpResponseRedirect(self.redirect_url)
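The query string built in `construct_oidc_logout_url` follows OIDC RP-initiated logout: `id_token_hint`, a random `state`, and `post_logout_redirect_uri`. A self-contained sketch of that URL assembly (the endpoint, token, and callback values below are made-up stand-ins, and `secrets.token_urlsafe` stands in for Django's `crypto.get_random_string`):

```python
import secrets
from urllib.parse import parse_qs, urlencode, urlparse


def build_logout_url(endpoint: str, id_token: str, callback: str) -> str:
    """Assemble an RP-initiated logout URL for the OIDC provider."""
    query = {
        "id_token_hint": id_token,
        "state": secrets.token_urlsafe(24),  # must also be persisted in the session
        "post_logout_redirect_uri": callback,
    }
    return f"{endpoint}?{urlencode(query)}"


url = build_logout_url(
    "https://op.example/logout",
    "tok123",
    "https://app.example/logout-callback/",
)
params = parse_qs(urlparse(url).query)
```

`urlencode` takes care of escaping the redirect URI, which is why the view formats the URL in one f-string rather than concatenating parameters by hand.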
15
src/backend/core/enums.py
Normal file
@@ -0,0 +1,15 @@
"""
Core application enums declaration
"""
from django.conf import global_settings, settings
from django.utils.translation import gettext_lazy as _

# Django sets `LANGUAGES` by default with all supported languages. We can use it for
# the choice of languages, which should not be limited to the few languages active in
# the app.
# pylint: disable=no-member
ALL_LANGUAGES = getattr(
    settings,
    "ALL_LANGUAGES",
    [(language, _(name)) for language, name in global_settings.LANGUAGES],
)
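The `getattr(settings, name, default)` idiom above lets a project override `ALL_LANGUAGES` in its settings module while falling back to a computed default. The same pattern in isolation (the `Settings` class is a stand-in for `django.conf.settings`):

```python
class Settings:
    """Stand-in for django.conf.settings with no ALL_LANGUAGES override."""

    LANGUAGE_CODE = "en-us"


settings = Settings()
default_languages = [("en", "English"), ("fr", "French")]

# getattr returns the attribute when the settings module defines it,
# and the provided default otherwise.
ALL_LANGUAGES = getattr(settings, "ALL_LANGUAGES", default_languages)
```

Defining `ALL_LANGUAGES = [...]` on the settings object would make `getattr` return that override instead of the default list.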
25
src/backend/core/factories.py
Normal file
@@ -0,0 +1,25 @@
# ruff: noqa: S311
"""
Core application factories
"""
from django.conf import settings
from django.contrib.auth.hashers import make_password

import factory.fuzzy
from faker import Faker

from core import models

fake = Faker()


class UserFactory(factory.django.DjangoModelFactory):
    """A factory to create random users for testing purposes."""

    class Meta:
        model = models.User

    sub = factory.Sequence(lambda n: f"user{n!s}")
    email = factory.Faker("email")
    language = factory.fuzzy.FuzzyChoice([lang[0] for lang in settings.LANGUAGES])
    password = make_password("password")
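What `factory.Sequence` and `factory.fuzzy.FuzzyChoice` contribute, sketched without factory_boy: a counter that guarantees unique `sub` values across instances, and a random pick from the configured language codes (the `LANGUAGES` list below is an assumed stand-in for `settings.LANGUAGES`):

```python
import itertools
import random

LANGUAGES = [("en-us", "English"), ("fr-fr", "French")]  # assumed settings.LANGUAGES
_counter = itertools.count()  # shared counter, like factory.Sequence


def make_user() -> dict:
    """Build a user dict the way UserFactory builds model instances."""
    n = next(_counter)
    return {
        "sub": f"user{n}",                                     # factory.Sequence
        "language": random.choice([code for code, _ in LANGUAGES]),  # FuzzyChoice
    }


u0, u1 = make_user(), make_user()
```

The sequence is what keeps the unique `sub` constraint satisfied when a test creates many users in a row.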
166
src/backend/core/migrations/0001_initial.py
Normal file
@@ -0,0 +1,166 @@
# Generated by Django 5.0.3 on 2024-05-28 20:29

import django.contrib.auth.models
import django.core.validators
import django.db.models.deletion
import timezone_field.fields
import uuid
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('auth', '0012_alter_user_first_name_max_length'),
    ]

    operations = [
        migrations.CreateModel(
            name='Document',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('title', models.CharField(max_length=255, verbose_name='title')),
                ('is_public', models.BooleanField(default=False, help_text='Whether this document is public for anyone to use.', verbose_name='public')),
            ],
            options={
                'verbose_name': 'Document',
                'verbose_name_plural': 'Documents',
                'db_table': 'impress_document',
                'ordering': ('title',),
            },
        ),
        migrations.CreateModel(
            name='Template',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('title', models.CharField(max_length=255, verbose_name='title')),
                ('description', models.TextField(blank=True, verbose_name='description')),
                ('code', models.TextField(blank=True, verbose_name='code')),
                ('css', models.TextField(blank=True, verbose_name='css')),
                ('is_public', models.BooleanField(default=False, help_text='Whether this template is public for anyone to use.', verbose_name='public')),
            ],
            options={
                'verbose_name': 'Template',
                'verbose_name_plural': 'Templates',
                'db_table': 'impress_template',
                'ordering': ('title',),
            },
        ),
        migrations.CreateModel(
            name='User',
            fields=[
                ('password', models.CharField(max_length=128, verbose_name='password')),
                ('last_login', models.DateTimeField(blank=True, null=True, verbose_name='last login')),
                ('is_superuser', models.BooleanField(default=False, help_text='Designates that this user has all permissions without explicitly assigning them.', verbose_name='superuser status')),
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('sub', models.CharField(blank=True, help_text='Required. 255 characters or fewer. Letters, numbers, and @/./+/-/_ characters only.', max_length=255, null=True, unique=True, validators=[django.core.validators.RegexValidator(message='Enter a valid sub. This value may contain only letters, numbers, and @/./+/-/_ characters.', regex='^[\\w.@+-]+\\Z')], verbose_name='sub')),
                ('email', models.EmailField(blank=True, max_length=254, null=True, verbose_name='identity email address')),
                ('admin_email', models.EmailField(blank=True, max_length=254, null=True, unique=True, verbose_name='admin email address')),
                ('language', models.CharField(choices=[('en-us', 'English'), ('fr-fr', 'French')], default='en-us', help_text='The language in which the user wants to see the interface.', max_length=10, verbose_name='language')),
                ('timezone', timezone_field.fields.TimeZoneField(choices_display='WITH_GMT_OFFSET', default='UTC', help_text='The timezone in which the user wants to see times.', use_pytz=False)),
                ('is_device', models.BooleanField(default=False, help_text='Whether the user is a device or a real user.', verbose_name='device')),
                ('is_staff', models.BooleanField(default=False, help_text='Whether the user can log into this admin site.', verbose_name='staff status')),
                ('is_active', models.BooleanField(default=True, help_text='Whether this user should be treated as active. Unselect this instead of deleting accounts.', verbose_name='active')),
                ('groups', models.ManyToManyField(blank=True, help_text='The groups this user belongs to. A user will get all permissions granted to each of their groups.', related_name='user_set', related_query_name='user', to='auth.group', verbose_name='groups')),
                ('user_permissions', models.ManyToManyField(blank=True, help_text='Specific permissions for this user.', related_name='user_set', related_query_name='user', to='auth.permission', verbose_name='user permissions')),
            ],
            options={
                'verbose_name': 'user',
                'verbose_name_plural': 'users',
                'db_table': 'impress_user',
            },
            managers=[
                ('objects', django.contrib.auth.models.UserManager()),
            ],
        ),
        migrations.CreateModel(
            name='DocumentAccess',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('team', models.CharField(blank=True, max_length=100)),
                ('role', models.CharField(choices=[('reader', 'Reader'), ('editor', 'Editor'), ('administrator', 'Administrator'), ('owner', 'Owner')], default='reader', max_length=20)),
                ('document', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.document')),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'verbose_name': 'Document/user relation',
                'verbose_name_plural': 'Document/user relations',
                'db_table': 'impress_document_access',
                'ordering': ('-created_at',),
            },
        ),
        migrations.CreateModel(
            name='Invitation',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('email', models.EmailField(max_length=254, verbose_name='email address')),
                ('role', models.CharField(choices=[('reader', 'Reader'), ('editor', 'Editor'), ('administrator', 'Administrator'), ('owner', 'Owner')], default='reader', max_length=20)),
                ('document', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='invitations', to='core.document')),
                ('issuer', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='invitations', to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'verbose_name': 'Document invitation',
                'verbose_name_plural': 'Document invitations',
                'db_table': 'impress_invitation',
            },
        ),
        migrations.CreateModel(
            name='TemplateAccess',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
                ('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
                ('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
                ('team', models.CharField(blank=True, max_length=100)),
                ('role', models.CharField(choices=[('reader', 'Reader'), ('editor', 'Editor'), ('administrator', 'Administrator'), ('owner', 'Owner')], default='reader', max_length=20)),
                ('template', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.template')),
                ('user', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
            options={
                'verbose_name': 'Template/user relation',
                'verbose_name_plural': 'Template/user relations',
                'db_table': 'impress_template_access',
                'ordering': ('-created_at',),
            },
        ),
        migrations.AddConstraint(
            model_name='documentaccess',
            constraint=models.UniqueConstraint(condition=models.Q(('user__isnull', False)), fields=('user', 'document'), name='unique_document_user', violation_error_message='This user is already in this document.'),
        ),
        migrations.AddConstraint(
            model_name='documentaccess',
            constraint=models.UniqueConstraint(condition=models.Q(('team__gt', '')), fields=('team', 'document'), name='unique_document_team', violation_error_message='This team is already in this document.'),
        ),
        migrations.AddConstraint(
            model_name='documentaccess',
            constraint=models.CheckConstraint(check=models.Q(models.Q(('team', ''), ('user__isnull', False)), models.Q(('team__gt', ''), ('user__isnull', True)), _connector='OR'), name='check_document_access_either_user_or_team', violation_error_message='Either user or team must be set, not both.'),
        ),
        migrations.AddConstraint(
            model_name='invitation',
            constraint=models.UniqueConstraint(fields=('email', 'document'), name='email_and_document_unique_together'),
        ),
        migrations.AddConstraint(
            model_name='templateaccess',
            constraint=models.UniqueConstraint(condition=models.Q(('user__isnull', False)), fields=('user', 'template'), name='unique_template_user', violation_error_message='This user is already in this template.'),
        ),
        migrations.AddConstraint(
            model_name='templateaccess',
            constraint=models.UniqueConstraint(condition=models.Q(('team__gt', '')), fields=('team', 'template'), name='unique_template_team', violation_error_message='This team is already in this template.'),
        ),
        migrations.AddConstraint(
            model_name='templateaccess',
            constraint=models.CheckConstraint(check=models.Q(models.Q(('team', ''), ('user__isnull', False)), models.Q(('team__gt', ''), ('user__isnull', True)), _connector='OR'), name='check_template_access_either_user_or_team', violation_error_message='Either user or team must be set, not both.'),
        ),
    ]
14
src/backend/core/migrations/0002_create_pg_trgm_extension.py
Normal file
@@ -0,0 +1,14 @@
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0001_initial'),
    ]

    operations = [
        migrations.RunSQL(
            "CREATE EXTENSION IF NOT EXISTS pg_trgm;",
            reverse_sql="DROP EXTENSION IF EXISTS pg_trgm;",
        ),
    ]
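Pairing the forward SQL with `reverse_sql` is what keeps this migration reversible: `migrate` runs the first statement, `migrate core 0001` would run the second. A toy state machine showing the apply/unapply symmetry (everything here is illustrative, not Django's migration executor):

```python
installed = set()  # stand-in for the database's installed extensions

FORWARD = "CREATE EXTENSION IF NOT EXISTS pg_trgm;"
REVERSE = "DROP EXTENSION IF EXISTS pg_trgm;"


def apply(sql: str) -> None:
    """Pretend to run the forward SQL of a RunSQL operation."""
    if sql.startswith("CREATE EXTENSION"):
        installed.add("pg_trgm")


def unapply(sql: str) -> None:
    """Pretend to run the reverse_sql when the migration is rolled back."""
    if sql.startswith("DROP EXTENSION"):
        installed.discard("pg_trgm")


apply(FORWARD)
was_installed = "pg_trgm" in installed
unapply(REVERSE)
```

Without `reverse_sql`, Django treats a `RunSQL` operation as irreversible and refuses to migrate backwards past it.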
0
src/backend/core/migrations/__init__.py
Normal file
143
src/backend/core/models.py
Normal file
@@ -0,0 +1,143 @@
"""
Declare and configure the models for the impress core application
"""
import uuid
from logging import getLogger

from django.conf import settings
from django.contrib.auth import models as auth_models
from django.contrib.auth.base_user import AbstractBaseUser
from django.core import mail, validators
from django.db import models
from django.utils.functional import lazy
from django.utils.translation import gettext_lazy as _

from timezone_field import TimeZoneField

logger = getLogger(__name__)


class BaseModel(models.Model):
    """
    Serves as an abstract base model for other models, ensuring that records are validated
    before saving, as Django doesn't do it by default.

    Includes fields common to all models: a UUID primary key and creation/update timestamps.
    """

    id = models.UUIDField(
        verbose_name=_("id"),
        help_text=_("primary key for the record as UUID"),
        primary_key=True,
        default=uuid.uuid4,
        editable=False,
    )
    created_at = models.DateTimeField(
        verbose_name=_("created on"),
        help_text=_("date and time at which a record was created"),
        auto_now_add=True,
        editable=False,
    )
    updated_at = models.DateTimeField(
        verbose_name=_("updated on"),
        help_text=_("date and time at which a record was last updated"),
        auto_now=True,
        editable=False,
    )

    class Meta:
        abstract = True

    def save(self, *args, **kwargs):
        """Call `full_clean` before saving."""
        self.full_clean()
        super().save(*args, **kwargs)


class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
    """User model to work with OIDC-only authentication."""

    sub_validator = validators.RegexValidator(
        regex=r"^[\w.@+-]+\Z",
        message=_(
            "Enter a valid sub. This value may contain only letters, "
            "numbers, and @/./+/-/_ characters."
        ),
    )

    sub = models.CharField(
        _("sub"),
        help_text=_(
            "Required. 255 characters or fewer. Letters, numbers, and @/./+/-/_ characters only."
        ),
        max_length=255,
        unique=True,
        validators=[sub_validator],
        blank=True,
        null=True,
    )
    email = models.EmailField(_("identity email address"), blank=True, null=True)

    # Unlike the "email" field, which stores the email coming from the OIDC token, this field
    # stores the email used by staff users to log in to the admin site
    admin_email = models.EmailField(
        _("admin email address"), unique=True, blank=True, null=True
    )

    language = models.CharField(
        max_length=10,
        choices=lazy(lambda: settings.LANGUAGES, tuple)(),
        default=settings.LANGUAGE_CODE,
        verbose_name=_("language"),
        help_text=_("The language in which the user wants to see the interface."),
    )
    timezone = TimeZoneField(
        choices_display="WITH_GMT_OFFSET",
        use_pytz=False,
        default=settings.TIME_ZONE,
        help_text=_("The timezone in which the user wants to see times."),
    )
    is_device = models.BooleanField(
        _("device"),
        default=False,
        help_text=_("Whether the user is a device or a real user."),
    )
    is_staff = models.BooleanField(
        _("staff status"),
        default=False,
        help_text=_("Whether the user can log into this admin site."),
    )
    is_active = models.BooleanField(
        _("active"),
        default=True,
        help_text=_(
            "Whether this user should be treated as active. "
            "Unselect this instead of deleting accounts."
        ),
    )

    objects = auth_models.UserManager()

    USERNAME_FIELD = "admin_email"
    REQUIRED_FIELDS = []

    class Meta:
        db_table = "impress_user"
        verbose_name = _("user")
        verbose_name_plural = _("users")

    def __str__(self):
        return self.email or self.admin_email or str(self.id)

    def email_user(self, subject, message, from_email=None, **kwargs):
        """Email this user."""
        if not self.email:
            raise ValueError("User has no email address.")
        mail.send_mail(subject, message, from_email, [self.email], **kwargs)

    def get_teams(self):
        """
        Get the list of teams the user belongs to, as a list of strings.
        Must be cached if retrieved remotely.
        """
        return []
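`BaseModel.save` calls `full_clean()` first because Django's plain `Model.save()` does not run validators by default. The validate-before-persist pattern in isolation, with a toy record class (nothing here touches the ORM):

```python
saved = []  # stand-in for the database table


class Record:
    """Minimal model-like object that validates itself before saving."""

    def __init__(self, title: str):
        self.title = title

    def full_clean(self) -> None:
        """Reject invalid state, like Django field/model validation would."""
        if not self.title:
            raise ValueError("title may not be blank")

    def save(self) -> None:
        self.full_clean()  # validate first, so invalid records never persist
        saved.append(self)


Record("ok").save()
try:
    Record("").save()  # fails validation, never reaches the store
except ValueError:
    pass
```

The payoff is that every code path that saves a record gets validation for free, instead of relying on each caller to remember `full_clean()`.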
BIN
src/backend/core/static/images/logo-suite-numerique.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 13 KiB
BIN
src/backend/core/static/images/logo.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 1.2 KiB
BIN
src/backend/core/static/images/mail-header-background.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 40 KiB
14
src/backend/core/templates/core/generate_document.html
Normal file
@@ -0,0 +1,14 @@
<!DOCTYPE html>
<html>
<head>
    <title>Generate Document</title>
</head>
<body>
    <h2>Generate Document</h2>
    <form method="post" enctype="multipart/form-data">
        {% csrf_token %}
        {{ form.as_p }}
        <button type="submit">Generate PDF</button>
    </form>
</body>
</html>
0
src/backend/core/templatetags/__init__.py
Normal file
58
src/backend/core/templatetags/extra_tags.py
Normal file
@@ -0,0 +1,58 @@
"""Custom template tags for the core application of People."""

import base64

from django import template
from django.contrib.staticfiles import finders

from PIL import ImageFile as PillowImageFile

register = template.Library()


def image_to_base64(file_or_path, close=False):
    """
    Return the "src" string for the base64 encoding of an image, given either
    its path or a file object (open or closed).

    Inspired by Django's "get_image_dimensions".
    """
    pil_parser = PillowImageFile.Parser()
    if hasattr(file_or_path, "read"):
        file = file_or_path
        if file.closed and hasattr(file, "open"):
            file_or_path.open()
        file_pos = file.tell()
        file.seek(0)
    else:
        try:
            # pylint: disable=consider-using-with
            file = open(file_or_path, "rb")
        except OSError:
            return ""
        close = True

    try:
        image_data = file.read()
        if not image_data:
            return ""
        pil_parser.feed(image_data)
        if pil_parser.image:
            mime_type = pil_parser.image.get_format_mimetype()
            encoded_string = base64.b64encode(image_data)
            return f"data:{mime_type:s};base64, {encoded_string.decode('utf-8'):s}"
        return ""
    finally:
        if close:
            file.close()
        else:
            file.seek(file_pos)


@register.simple_tag
def base64_static(path):
    """Return a static file as a base64-encoded data URI."""
    full_path = finders.find(path)
    if full_path:
        return image_to_base64(full_path, True)
    return ""
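The tag above sniffs the MIME type from the image bytes with Pillow. The same data-URI output can be sketched with the stdlib alone, guessing the MIME type from the file name instead; `file_to_data_uri` is a hypothetical name, not part of this diff:

```python
import base64
import mimetypes
import tempfile


def file_to_data_uri(path):
    """Return a data URI for a file, guessing the MIME type from its
    extension (unlike the Pillow-based tag, which sniffs the bytes)."""
    mime_type, _ = mimetypes.guess_type(path)
    if mime_type is None:
        return ""
    with open(path, "rb") as fileobj:
        data = fileobj.read()
    if not data:
        return ""
    # Same "data:<mime>;base64, <payload>" shape as the template tag above.
    return f"data:{mime_type};base64, {base64.b64encode(data).decode('utf-8')}"


# Tiny valid 1x1 transparent PNG, used here only as sample data.
PNG_1X1 = base64.b64decode(
    "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJ"
    "AAAADUlEQVR42mNkYPhfDwAChwGA60e6kgAAAABJRU5ErkJggg=="
)

with tempfile.NamedTemporaryFile(suffix=".png", delete=False) as tmp:
    tmp.write(PNG_1X1)

uri = file_to_data_uri(tmp.name)
print(uri[:22])
```

Extension-based guessing is cheaper but trusts the file name; the Pillow parser in the real tag is the safer choice when the bytes may not match the extension.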
0
src/backend/core/tests/__init__.py
Normal file
0
src/backend/core/tests/authentication/__init__.py
Normal file
101
src/backend/core/tests/authentication/test_backends.py
Normal file
@@ -0,0 +1,101 @@
"""Unit tests for the Authentication Backends."""

from django.core.exceptions import SuspiciousOperation

import pytest

from core import models
from core.authentication.backends import OIDCAuthenticationBackend
from core.factories import UserFactory

pytestmark = pytest.mark.django_db


def test_authentication_getter_existing_user_no_email(
    django_assert_num_queries, monkeypatch
):
    """
    If an existing user matches the userinfo "sub", that user should be returned.
    """
    klass = OIDCAuthenticationBackend()
    db_user = UserFactory()

    def get_userinfo_mocked(*args):
        return {"sub": db_user.sub}

    monkeypatch.setattr(OIDCAuthenticationBackend, "get_userinfo", get_userinfo_mocked)

    with django_assert_num_queries(1):
        user = klass.get_or_create_user(
            access_token="test-token", id_token=None, payload=None
        )

    assert user == db_user


def test_authentication_getter_new_user_no_email(monkeypatch):
    """
    If no user matches the userinfo "sub", a user should be created.
    When the userinfo doesn't contain an email, the created user's email should be empty.
    """
    klass = OIDCAuthenticationBackend()

    def get_userinfo_mocked(*args):
        return {"sub": "123"}

    monkeypatch.setattr(OIDCAuthenticationBackend, "get_userinfo", get_userinfo_mocked)

    user = klass.get_or_create_user(
        access_token="test-token", id_token=None, payload=None
    )

    assert user.sub == "123"
    assert user.email is None
    assert user.password == "!"
    assert models.User.objects.count() == 1


def test_authentication_getter_new_user_with_email(monkeypatch):
    """
    If no user matches the userinfo "sub", a user should be created
    with their email set from the userinfo.
    """
    klass = OIDCAuthenticationBackend()

    email = "impress@example.com"

    def get_userinfo_mocked(*args):
        return {"sub": "123", "email": email, "first_name": "John", "last_name": "Doe"}

    monkeypatch.setattr(OIDCAuthenticationBackend, "get_userinfo", get_userinfo_mocked)

    user = klass.get_or_create_user(
        access_token="test-token", id_token=None, payload=None
    )

    assert user.sub == "123"
    assert user.email == email
    assert user.password == "!"
    assert models.User.objects.count() == 1


def test_models_oidc_user_getter_invalid_token(django_assert_num_queries, monkeypatch):
    """If the userinfo doesn't contain a "sub", a SuspiciousOperation should be raised."""
    klass = OIDCAuthenticationBackend()

    def get_userinfo_mocked(*args):
        return {
            "test": "123",
        }

    monkeypatch.setattr(OIDCAuthenticationBackend, "get_userinfo", get_userinfo_mocked)

    with django_assert_num_queries(0), pytest.raises(
        SuspiciousOperation,
        match="User info contained no recognizable user identification",
    ):
        klass.get_or_create_user(access_token="test-token", id_token=None, payload=None)

    assert models.User.objects.exists() is False
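The backend class these tests exercise (`core.authentication.backends.OIDCAuthenticationBackend`) is not part of this diff. A minimal stand-in consistent with the tests' assertions — sub-based lookup, lazy creation with an unusable password, and a `SuspiciousOperation` when no `sub` is present — can be sketched as follows; all names here are assumptions for illustration:

```python
class SuspiciousOperation(Exception):
    """Stand-in for django.core.exceptions.SuspiciousOperation."""


class FakeOIDCBackend:
    """In-memory sketch of the get_or_create_user flow the tests imply."""

    def __init__(self):
        self.users = {}  # sub -> user dict, standing in for the User table

    def get_userinfo(self, access_token, id_token, payload):
        raise NotImplementedError  # monkeypatched in the tests above

    def get_or_create_user(self, access_token, id_token, payload):
        user_info = self.get_userinfo(access_token, id_token, payload)
        sub = user_info.get("sub")
        if sub is None:
            raise SuspiciousOperation(
                "User info contained no recognizable user identification"
            )
        user = self.users.get(sub)
        if user is None:
            # "!" marks an unusable password, matching the tests' assertions.
            user = {"sub": sub, "email": user_info.get("email"), "password": "!"}
            self.users[sub] = user
        return user


backend = FakeOIDCBackend()
backend.get_userinfo = lambda *args: {"sub": "123", "email": "impress@example.com"}
user = backend.get_or_create_user("test-token", None, None)
print(user["sub"], user["email"], user["password"])
```

A second call with the same `sub` returns the same stored user, which is why the first test above can assert a single database query.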
10
src/backend/core/tests/authentication/test_urls.py
Normal file
@@ -0,0 +1,10 @@
"""Unit tests for the Authentication URLs."""

from core.authentication.urls import urlpatterns


def test_urls_override_default_mozilla_django_oidc():
    """Custom URL patterns should override default ones from Mozilla Django OIDC."""

    url_names = [u.name for u in urlpatterns]
    assert url_names.index("oidc_logout_custom") < url_names.index("oidc_logout")
231
src/backend/core/tests/authentication/test_views.py
Normal file
@@ -0,0 +1,231 @@
"""Unit tests for the Authentication Views."""

from unittest import mock
from urllib.parse import parse_qs, urlparse

from django.contrib.auth.models import AnonymousUser
from django.contrib.sessions.middleware import SessionMiddleware
from django.core.exceptions import SuspiciousOperation
from django.test import RequestFactory
from django.test.utils import override_settings
from django.urls import reverse
from django.utils import crypto

import pytest
from rest_framework.test import APIClient

from core import factories
from core.authentication.views import OIDCLogoutCallbackView, OIDCLogoutView

pytestmark = pytest.mark.django_db


@override_settings(LOGOUT_REDIRECT_URL="/example-logout")
def test_view_logout_anonymous():
    """Anonymous users calling the logout URL should be redirected
    to the specified LOGOUT_REDIRECT_URL."""

    url = reverse("oidc_logout_custom")
    response = APIClient().get(url)

    assert response.status_code == 302
    assert response.url == "/example-logout"


@mock.patch.object(
    OIDCLogoutView, "construct_oidc_logout_url", return_value="/example-logout"
)
def test_view_logout(mocked_oidc_logout_url):
    """Authenticated users should be redirected to the OIDC provider for logout."""

    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    url = reverse("oidc_logout_custom")
    response = client.get(url)

    mocked_oidc_logout_url.assert_called_once()

    assert response.status_code == 302
    assert response.url == "/example-logout"


@override_settings(LOGOUT_REDIRECT_URL="/default-redirect-logout")
@mock.patch.object(
    OIDCLogoutView, "construct_oidc_logout_url", return_value="/default-redirect-logout"
)
def test_view_logout_no_oidc_provider(mocked_oidc_logout_url):
    """Authenticated users should be logged out when no OIDC provider is available."""

    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    url = reverse("oidc_logout_custom")

    with mock.patch("mozilla_django_oidc.views.auth.logout") as mock_logout:
        response = client.get(url)
        mocked_oidc_logout_url.assert_called_once()
        mock_logout.assert_called_once()

    assert response.status_code == 302
    assert response.url == "/default-redirect-logout"


@override_settings(LOGOUT_REDIRECT_URL="/example-logout")
def test_view_logout_callback_anonymous():
    """Anonymous users calling the logout callback URL should be redirected
    to the specified LOGOUT_REDIRECT_URL."""

    url = reverse("oidc_logout_callback")
    response = APIClient().get(url)

    assert response.status_code == 302
    assert response.url == "/example-logout"


@pytest.mark.parametrize(
    "initial_oidc_states",
    [{}, {"other_state": "foo"}],
)
def test_view_logout_persist_state(initial_oidc_states):
    """The state value should be persisted in the session's data."""

    user = factories.UserFactory()

    request = RequestFactory().request()
    request.user = user

    middleware = SessionMiddleware(get_response=lambda x: x)
    middleware.process_request(request)

    if initial_oidc_states:
        request.session["oidc_states"] = initial_oidc_states
        request.session.save()

    mocked_state = "mock_state"

    OIDCLogoutView().persist_state(request, mocked_state)

    assert "oidc_states" in request.session
    assert request.session["oidc_states"] == {
        "mock_state": {},
        **initial_oidc_states,
    }


@override_settings(OIDC_OP_LOGOUT_ENDPOINT="/example-logout")
@mock.patch.object(OIDCLogoutView, "persist_state")
@mock.patch.object(crypto, "get_random_string", return_value="mocked_state")
def test_view_logout_construct_oidc_logout_url(
    mocked_get_random_string, mocked_persist_state
):
    """Should construct the logout URL that initiates the logout flow with the OIDC provider."""

    user = factories.UserFactory()

    request = RequestFactory().request()
    request.user = user

    middleware = SessionMiddleware(get_response=lambda x: x)
    middleware.process_request(request)

    request.session["oidc_id_token"] = "mocked_oidc_id_token"
    request.session.save()

    redirect_url = OIDCLogoutView().construct_oidc_logout_url(request)

    mocked_persist_state.assert_called_once()
    mocked_get_random_string.assert_called_once()

    params = parse_qs(urlparse(redirect_url).query)

    assert params["id_token_hint"][0] == "mocked_oidc_id_token"
    assert params["state"][0] == "mocked_state"

    url = reverse("oidc_logout_callback")
    assert url in params["post_logout_redirect_uri"][0]


@override_settings(LOGOUT_REDIRECT_URL="/")
def test_view_logout_construct_oidc_logout_url_none_id_token():
    """If no ID token is available in the session,
    the user should be redirected to the final URL."""

    user = factories.UserFactory()

    request = RequestFactory().request()
    request.user = user

    middleware = SessionMiddleware(get_response=lambda x: x)
    middleware.process_request(request)

    redirect_url = OIDCLogoutView().construct_oidc_logout_url(request)

    assert redirect_url == "/"


@pytest.mark.parametrize(
    "initial_state",
    [None, {"other_state": "foo"}],
)
def test_view_logout_callback_wrong_state(initial_state):
    """Should raise an error if the OIDC state doesn't match the session data."""

    user = factories.UserFactory()

    request = RequestFactory().request()
    request.user = user

    middleware = SessionMiddleware(get_response=lambda x: x)
    middleware.process_request(request)

    if initial_state:
        request.session["oidc_states"] = initial_state
        request.session.save()

    callback_view = OIDCLogoutCallbackView.as_view()

    with pytest.raises(SuspiciousOperation) as excinfo:
        callback_view(request)

    assert (
        str(excinfo.value) == "OIDC callback state not found in session `oidc_states`!"
    )


@override_settings(LOGOUT_REDIRECT_URL="/example-logout")
def test_view_logout_callback():
    """If the state matches, the callback should clear the OIDC state and redirect."""

    user = factories.UserFactory()

    request = RequestFactory().get("/logout-callback/", data={"state": "mocked_state"})
    request.user = user

    middleware = SessionMiddleware(get_response=lambda x: x)
    middleware.process_request(request)

    mocked_state = "mocked_state"

    request.session["oidc_states"] = {mocked_state: {}}
    request.session.save()

    callback_view = OIDCLogoutCallbackView.as_view()

    with mock.patch("mozilla_django_oidc.views.auth.logout") as mock_logout:

        def clear_user(request):
            # Assert the state is cleared prior to logout
            assert request.session["oidc_states"] == {}
            request.user = AnonymousUser()

        mock_logout.side_effect = clear_user
        response = callback_view(request)
        mock_logout.assert_called_once()

    assert response.status_code == 302
    assert response.url == "/example-logout"
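The logout URL these tests pick apart can be sketched with the stdlib. The parameter names (`id_token_hint`, `state`, `post_logout_redirect_uri`) come from the assertions above; the function name, endpoint, and callback path are illustrative assumptions, not the view's actual code:

```python
from urllib.parse import parse_qs, urlencode, urlparse


def construct_logout_url(logout_endpoint, id_token, state, callback_url):
    """Build an RP-initiated logout URL of the shape the tests inspect."""
    params = {
        "id_token_hint": id_token,            # the session's OIDC ID token
        "state": state,                        # random value persisted in session
        "post_logout_redirect_uri": callback_url,  # our logout callback view
    }
    return f"{logout_endpoint}?{urlencode(params)}"


url = construct_logout_url(
    "/example-logout",
    "mocked_oidc_id_token",
    "mocked_state",
    "http://testserver/logout-callback/",  # hypothetical callback URL
)
params = parse_qs(urlparse(url).query)
print(params["id_token_hint"][0], params["state"][0])
```

On the return trip, the callback view compares the `state` query parameter against the session's `oidc_states` before logging the user out, which is exactly what the wrong-state test asserts.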
15
src/backend/core/tests/conftest.py
Normal file
@@ -0,0 +1,15 @@
"""Fixtures for tests in the impress core application"""
from unittest import mock

import pytest

USER = "user"
TEAM = "team"
VIA = [USER, TEAM]


@pytest.fixture
def mock_user_get_teams():
    """Mock for the "get_teams" method on the User model."""
    with mock.patch("core.models.User.get_teams") as mock_get_teams:
        yield mock_get_teams
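The fixture above wraps the standard `mock.patch` context-manager pattern. The same mechanics can be shown standalone, using a local stand-in class instead of `core.models.User`:

```python
from unittest import mock


class User:
    """Stand-in for core.models.User, only to demonstrate the patching pattern."""

    def get_teams(self):
        return []


# Equivalent of what the fixture yields: inside the context, every call to
# User.get_teams goes through the MagicMock, so tests can set return values
# and inspect calls.
with mock.patch.object(User, "get_teams") as mock_get_teams:
    mock_get_teams.return_value = ["team-a"]
    teams = User().get_teams()

print(teams, mock_get_teams.call_count)
```

Because the fixture uses `yield` inside the `with` block, the patch is active for the duration of the test and automatically undone afterwards.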
0
src/backend/core/tests/swagger/__init__.py
Normal file
41
src/backend/core/tests/swagger/test_openapi_schema.py
Normal file
@@ -0,0 +1,41 @@
"""
Test suite for the generated OpenAPI schema.
"""
import json
from io import StringIO

from django.core.management import call_command
from django.test import Client

import pytest

pytestmark = pytest.mark.django_db


def test_openapi_client_schema():
    """
    The generated and served OpenAPI client schema should match.
    """
    # Start by generating the swagger.json file
    output = StringIO()
    call_command(
        "spectacular",
        "--api-version",
        "v1.0",
        "--urlconf",
        "core.urls",
        "--format",
        "openapi-json",
        "--file",
        "core/tests/swagger/swagger.json",
        stdout=output,
    )
    assert output.getvalue() == ""

    response = Client().get("/v1.0/swagger.json")

    assert response.status_code == 200
    with open(
        "core/tests/swagger/swagger.json", "r", encoding="utf-8"
    ) as expected_schema:
        assert response.json() == json.load(expected_schema)
417
src/backend/core/tests/test_api_users.py
Normal file
@@ -0,0 +1,417 @@
"""
Test users API endpoints in the impress core app.
"""
import pytest
from rest_framework.test import APIClient

from core import factories, models
from core.api import serializers

pytestmark = pytest.mark.django_db


def test_api_users_list_anonymous():
    """Anonymous users should not be allowed to list users."""
    factories.UserFactory()
    client = APIClient()
    response = client.get("/api/v1.0/users/")
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_users_list_authenticated():
    """
    Authenticated users should be able to list users.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    factories.UserFactory.create_batch(2)
    response = client.get(
        "/api/v1.0/users/",
    )
    assert response.status_code == 200
    content = response.json()
    assert len(content["results"]) == 3


def test_api_users_list_query_email():
    """
    Authenticated users should be able to list users
    and filter them by email.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    dave = factories.UserFactory(email="david.bowman@work.com")
    nicole = factories.UserFactory(email="nicole_foole@work.com")
    frank = factories.UserFactory(email="frank_poole@work.com")
    factories.UserFactory(email="heywood_floyd@work.com")

    response = client.get(
        "/api/v1.0/users/?q=david.bowman@work.com",
    )
    assert response.status_code == 200
    user_ids = [user["id"] for user in response.json()["results"]]
    assert user_ids == [str(dave.id)]

    response = client.get("/api/v1.0/users/?q=oole")

    assert response.status_code == 200
    user_ids = [user["id"] for user in response.json()["results"]]
    assert user_ids == [str(nicole.id), str(frank.id)]


def test_api_users_retrieve_me_anonymous():
    """Anonymous users should not be allowed to retrieve their own user via "/users/me"."""
    factories.UserFactory.create_batch(2)
    client = APIClient()
    response = client.get("/api/v1.0/users/me/")
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_users_retrieve_me_authenticated():
    """Authenticated users should be able to retrieve their own user via the "/users/me" path."""
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    factories.UserFactory.create_batch(2)
    response = client.get(
        "/api/v1.0/users/me/",
    )

    assert response.status_code == 200
    assert response.json() == {
        "id": str(user.id),
        "email": user.email,
    }


def test_api_users_retrieve_anonymous():
    """Anonymous users should not be allowed to retrieve a user."""
    client = APIClient()
    user = factories.UserFactory()
    response = client.get(f"/api/v1.0/users/{user.id!s}/")

    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }


def test_api_users_retrieve_authenticated_self():
    """
    Retrieving a user by ID should not be allowed, even for the user themselves.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    response = client.get(
        f"/api/v1.0/users/{user.id!s}/",
    )
    assert response.status_code == 405
    assert response.json() == {"detail": 'Method "GET" not allowed.'}


def test_api_users_retrieve_authenticated_other():
    """
    Retrieving another user's detail view should not be allowed either.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    other_user = factories.UserFactory()

    response = client.get(
        f"/api/v1.0/users/{other_user.id!s}/",
    )
    assert response.status_code == 405
    assert response.json() == {"detail": 'Method "GET" not allowed.'}


def test_api_users_create_anonymous():
    """Anonymous users should not be able to create users via the API."""
    response = APIClient().post(
        "/api/v1.0/users/",
        {
            "language": "fr-fr",
            "password": "mypassword",
        },
    )
    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }
    assert models.User.objects.exists() is False


def test_api_users_create_authenticated():
    """Authenticated users should not be able to create users via the API."""
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    response = client.post(
        "/api/v1.0/users/",
        {
            "language": "fr-fr",
            "password": "mypassword",
        },
        format="json",
    )
    assert response.status_code == 405
    assert response.json() == {"detail": 'Method "POST" not allowed.'}
    assert models.User.objects.exclude(id=user.id).exists() is False


def test_api_users_update_anonymous():
    """Anonymous users should not be able to update users via the API."""
    user = factories.UserFactory()

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = serializers.UserSerializer(instance=factories.UserFactory()).data

    response = APIClient().put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
    )

    assert response.status_code == 401
    assert response.json() == {
        "detail": "Authentication credentials were not provided."
    }

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_update_authenticated_self():
    """
    Authenticated users should be able to update their own user but only the "language"
    and "timezone" fields.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    response = client.put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
    )

    assert response.status_code == 200
    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        if key in ["language", "timezone"]:
            assert value == new_user_values[key]
        else:
            assert value == old_user_values[key]


def test_api_users_update_authenticated_other():
    """Authenticated users should not be allowed to update other users."""
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    user = factories.UserFactory()
    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = serializers.UserSerializer(instance=factories.UserFactory()).data

    response = client.put(
        f"/api/v1.0/users/{user.id!s}/",
        new_user_values,
        format="json",
    )

    assert response.status_code == 403
    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_patch_anonymous():
    """Anonymous users should not be able to patch users via the API."""
    user = factories.UserFactory()

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = APIClient().patch(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
        )
        assert response.status_code == 401
        assert response.json() == {
            "detail": "Authentication credentials were not provided."
        }

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_patch_authenticated_self():
    """
    Authenticated users should be able to patch their own user but only the "language"
    and "timezone" fields.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = client.patch(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
        )
        assert response.status_code == 200

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        if key in ["language", "timezone"]:
            assert value == new_user_values[key]
        else:
            assert value == old_user_values[key]


def test_api_users_patch_authenticated_other():
    """Authenticated users should not be allowed to patch other users."""
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    user = factories.UserFactory()
    old_user_values = dict(serializers.UserSerializer(instance=user).data)
    new_user_values = dict(
        serializers.UserSerializer(instance=factories.UserFactory()).data
    )

    for key, new_value in new_user_values.items():
        response = client.patch(
            f"/api/v1.0/users/{user.id!s}/",
            {key: new_value},
            format="json",
        )
        assert response.status_code == 403

    user.refresh_from_db()
    user_values = dict(serializers.UserSerializer(instance=user).data)
    for key, value in user_values.items():
        assert value == old_user_values[key]


def test_api_users_delete_list_anonymous():
    """Anonymous users should not be allowed to delete a list of users."""
    factories.UserFactory.create_batch(2)

    client = APIClient()
    response = client.delete("/api/v1.0/users/")

    assert response.status_code == 401
    assert models.User.objects.count() == 2


def test_api_users_delete_list_authenticated():
    """Authenticated users should not be allowed to delete a list of users."""
    factories.UserFactory.create_batch(2)
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    response = client.delete(
        "/api/v1.0/users/",
    )

    assert response.status_code == 405
    assert models.User.objects.count() == 3


def test_api_users_delete_anonymous():
    """Anonymous users should not be allowed to delete a user."""
    user = factories.UserFactory()

    response = APIClient().delete(f"/api/v1.0/users/{user.id!s}/")

    assert response.status_code == 401
    assert models.User.objects.count() == 1


def test_api_users_delete_authenticated():
    """
    Authenticated users should not be allowed to delete a user other than themselves.
    """
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    other_user = factories.UserFactory()

    response = client.delete(
        f"/api/v1.0/users/{other_user.id!s}/",
    )

    assert response.status_code == 405
    assert models.User.objects.count() == 2


def test_api_users_delete_self():
    """Authenticated users should not be able to delete their own user."""
    user = factories.UserFactory()

    client = APIClient()
    client.force_login(user)

    response = client.delete(
        f"/api/v1.0/users/{user.id!s}/",
    )

    assert response.status_code == 405
    assert models.User.objects.count() == 1
45
src/backend/core/tests/test_models_users.py
Normal file
@@ -0,0 +1,45 @@
"""
Unit tests for the User model
"""
from unittest import mock

from django.core.exceptions import ValidationError

import pytest

from core import factories

pytestmark = pytest.mark.django_db


def test_models_users_str():
    """The str representation should be the email."""
    user = factories.UserFactory()
    assert str(user) == user.email


def test_models_users_id_unique():
    """The "id" field should be unique."""
    user = factories.UserFactory()
    with pytest.raises(ValidationError, match="User with this Id already exists."):
        factories.UserFactory(id=user.id)


def test_models_users_send_mail_main_existing():
    """The "email_user" method should send mail to the user's email address."""
    user = factories.UserFactory()

    with mock.patch("django.core.mail.send_mail") as mock_send:
        user.email_user("my subject", "my message")

    mock_send.assert_called_once_with("my subject", "my message", None, [user.email])


def test_models_users_send_mail_main_missing():
    """The "email_user" method should fail if the user has no email address."""
    user = factories.UserFactory(email=None)

    with pytest.raises(ValueError) as excinfo:
        user.email_user("my subject", "my message")

    assert str(excinfo.value) == "User has no email address."
24
src/backend/core/urls.py
Normal file
@@ -0,0 +1,24 @@
"""URL configuration for the core app."""
from django.conf import settings
from django.urls import include, path

from rest_framework.routers import DefaultRouter

from core.api import viewsets
from core.authentication.urls import urlpatterns as oidc_urls

# - Main endpoints
router = DefaultRouter()
router.register("users", viewsets.UserViewSet, basename="users")

urlpatterns = [
    path(
        f"api/{settings.API_VERSION}/",
        include(
            [
                *router.urls,
                *oidc_urls,
            ]
        ),
    ),
]
0
src/backend/demo/__init__.py
Normal file
10
src/backend/demo/data/template/code.txt
Normal file
@@ -0,0 +1,10 @@
<page size="A4">
  <div class="header">
    <img
      src="https://upload.wikimedia.org/wikipedia/fr/7/72/Logo_du_Gouvernement_de_la_R%C3%A9publique_fran%C3%A7aise_%282020%29.svg"
    />
  </div>
  <div class="content">
    <div class="body">{{ body }}</div>
  </div>
</page>
18
src/backend/demo/data/template/css.txt
Normal file
@@ -0,0 +1,18 @@
body {
  background: white;
  font-family: arial
}
.header {
  display: flex;
  justify-content: space-between;
}
.header img {
  width: 5cm;
  margin-left: -0.4cm;
}
.body {
  margin-top: 1.5rem
}
img {
  max-width: 100%;
}
5
src/backend/demo/defaults.py
Normal file
@@ -0,0 +1,5 @@
"""Parameters that define how the demo site will be built."""

NB_OBJECTS = {
    "users": 100,
}
0
src/backend/demo/management/__init__.py
Normal file
0
src/backend/demo/management/commands/__init__.py
Normal file
154
src/backend/demo/management/commands/create_demo.py
Normal file
@@ -0,0 +1,154 @@
# ruff: noqa: S311, S106
"""create_demo management command"""

import logging
import random
import time
from collections import defaultdict

from django import db
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError

from faker import Faker

from core import models

from demo import defaults

fake = Faker()

logger = logging.getLogger("impress.commands.demo.create_demo")


def random_true_with_probability(probability):
    """Return True with the requested probability, False otherwise."""
    return random.random() < probability


class BulkQueue:
    """A utility class to create Django model instances in bulk by just pushing to a queue."""

    BATCH_SIZE = 20000

    def __init__(self, stdout, *args, **kwargs):
        """Define the queue as a dict of lists."""
        self.queue = defaultdict(list)
        self.stdout = stdout

    def _bulk_create(self, objects):
        """Actually create instances in bulk in the database."""
        if not objects:
            return

        objects[0]._meta.model.objects.bulk_create(objects, ignore_conflicts=False)  # noqa: SLF001
        # In debug mode, Django keeps a query cache, which creates a memory leak in this case
        db.reset_queries()
        self.queue[objects[0]._meta.model.__name__] = []  # noqa: SLF001

    def push(self, obj):
        """Add a model instance to the queue so that it gets created in bulk."""
        objects = self.queue[obj._meta.model.__name__]  # noqa: SLF001
        objects.append(obj)
        if len(objects) > self.BATCH_SIZE:
            self._bulk_create(objects)
            self.stdout.write(".", ending="")

    def flush(self):
        """Flush the queue after creating the remaining model instances."""
        for objects in self.queue.values():
            self._bulk_create(objects)


class Timeit:
    """A utility context manager/method decorator to time execution."""

    total_time = 0

    def __init__(self, stdout, sentence=None):
        """Set the sentence to be displayed for timing information."""
        self.sentence = sentence
        self.start = None
        self.stdout = stdout

    def __call__(self, func):
        """Behavior on call for use as a method decorator."""

        def timeit_wrapper(*args, **kwargs):
            """Wrapper to start/stop the timer before/after the function call."""
            self.__enter__()
            result = func(*args, **kwargs)
            self.__exit__(None, None, None)
            return result

        return timeit_wrapper

    def __enter__(self):
        """Start timer upon entering context manager."""
        self.start = time.perf_counter()
        if self.sentence:
            self.stdout.write(self.sentence, ending=".")

    def __exit__(self, exc_type, exc_value, exc_tb):
        """Stop timer and display result upon leaving context manager."""
        if exc_type is not None:
            raise exc_type(exc_value)
        end = time.perf_counter()
        elapsed_time = end - self.start
        if self.sentence:
            self.stdout.write(f" Took {elapsed_time:g} seconds")

        self.__class__.total_time += elapsed_time
        return elapsed_time


def create_demo(stdout):
    """
    Create a database with demo data for developers to work in a realistic environment.
    The code is engineered to create a huge number of objects fast.
    """

    queue = BulkQueue(stdout)

    with Timeit(stdout, "Creating users"):
        for i in range(defaults.NB_OBJECTS["users"]):
            queue.push(
                models.User(
                    admin_email=f"user{i:d}@example.com",
                    email=f"user{i:d}@example.com",
                    password="!",
                    is_superuser=False,
                    is_active=True,
                    is_staff=False,
                    language=random.choice(settings.LANGUAGES)[0],
                )
            )
        queue.flush()


class Command(BaseCommand):
    """A management command to create a demo database."""

    help = __doc__

    def add_arguments(self, parser):
        """Add argument to require forcing execution when not in debug mode."""
        parser.add_argument(
            "-f",
            "--force",
            action="store_true",
            default=False,
            help="Force command execution even though DEBUG is set to False",
        )

    def handle(self, *args, **options):
        """Handling of the management command."""
        if not settings.DEBUG and not options["force"]:
            raise CommandError(
                (
                    "This command is not meant to be used in a production environment "
                    "unless you know what you are doing; if so, use the --force parameter"
                )
            )

        create_demo(self.stdout)
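The BulkQueue class above buffers instances per model and only writes when a batch fills up. A framework-free sketch of that batching pattern (the `MiniBulkQueue` and `sink` names are hypothetical stand-ins for the Django model queue and the database, with a tiny batch size for illustration):

```python
from collections import defaultdict


class MiniBulkQueue:
    """Sketch of the batching pattern: buffer objects per "model" name and
    write each buffer to a sink in batches instead of one at a time."""

    BATCH_SIZE = 3  # the real command uses 20000

    def __init__(self, sink):
        self.queue = defaultdict(list)
        self.sink = sink  # a dict of lists standing in for the database

    def _bulk_create(self, name, objects):
        """Write a whole buffer at once, then reset it (mimics bulk_create)."""
        if not objects:
            return
        self.sink.setdefault(name, []).extend(objects)
        self.queue[name] = []

    def push(self, name, obj):
        """Queue one object; flush automatically when the batch overflows."""
        objects = self.queue[name]
        objects.append(obj)
        if len(objects) > self.BATCH_SIZE:
            self._bulk_create(name, objects)

    def flush(self):
        """Write whatever remains in every buffer."""
        for name, objects in list(self.queue.items()):
            self._bulk_create(name, objects)


sink = {}
queue = MiniBulkQueue(sink)
for i in range(10):
    queue.push("User", f"user{i}")
queue.flush()
print(len(sink["User"]))  # → 10
```

The trade-off is the one the real command's `ignore_conflicts=False` hints at: batched writes are fast, but a single bad object fails its whole batch.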
46
src/backend/demo/management/commands/createsuperuser.py
Normal file
@@ -0,0 +1,46 @@
"""Management command to create a superuser."""
from django.contrib.auth import get_user_model
from django.core.management.base import BaseCommand

UserModel = get_user_model()


class Command(BaseCommand):
    """Management command to create a superuser from an email and a password."""

    help = "Create a superuser with an email and a password"

    def add_arguments(self, parser):
        """Define required arguments "email" and "password"."""
        parser.add_argument(
            "--email",
            help="Email for the user.",
        )
        parser.add_argument(
            "--password",
            help="Password for the user.",
        )

    def handle(self, *args, **options):
        """
        Given an email and a password, create a superuser or upgrade the existing
        user to superuser status.
        """
        email = options.get("email")
        try:
            user = UserModel.objects.get(admin_email=email)
        except UserModel.DoesNotExist:
            user = UserModel(admin_email=email)
            message = "Superuser created successfully."
        else:
            if user.is_superuser and user.is_staff:
                message = "Superuser already exists."
            else:
                message = "User already existed and was upgraded to superuser."

        user.is_superuser = True
        user.is_staff = True
        user.set_password(options["password"])
        user.save()

        self.stdout.write(self.style.SUCCESS(message))
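The `handle` method above is an upsert: create when the email is unknown, upgrade otherwise, and pick the message before mutating. The branching can be sketched without Django (the `upsert_superuser` function and the `users` dict keyed by `admin_email` are hypothetical stand-ins for the ORM lookup):

```python
def upsert_superuser(users, email, password):
    """Sketch of the command's create-or-upgrade logic; `users` stands in
    for the user table, keyed by admin_email."""
    user = users.get(email)
    if user is None:
        # DoesNotExist branch: build a fresh user
        user = {"admin_email": email, "is_superuser": False, "is_staff": False}
        users[email] = user
        message = "Superuser created successfully."
    elif user["is_superuser"] and user["is_staff"]:
        message = "Superuser already exists."
    else:
        message = "User already existed and was upgraded to superuser."

    # In every branch the flags end up set and the password refreshed
    user["is_superuser"] = True
    user["is_staff"] = True
    user["password"] = password
    return message


users = {}
print(upsert_superuser(users, "admin@example.com", "s3cret"))  # → Superuser created successfully.
```

Note the message is chosen from the *old* state, while the flags are written unconditionally afterwards, which keeps the three outcomes distinguishable without duplicating the save logic.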
0
src/backend/demo/tests/__init__.py
Normal file
18
src/backend/demo/tests/test_commands_create_demo.py
Normal file
@@ -0,0 +1,18 @@
"""Test the `create_demo` management command"""

from django.core.management import call_command
from django.test import override_settings

import pytest

from core import models

pytestmark = pytest.mark.django_db


@override_settings(DEBUG=True)
def test_commands_create_demo():
    """The create_demo management command should create objects as expected."""
    call_command("create_demo")

    assert models.User.objects.count() == 100
0
src/backend/impress/__init__.py
Normal file
22
src/backend/impress/celery_app.py
Normal file
@@ -0,0 +1,22 @@
"""Impress celery configuration file."""
import os

from celery import Celery
from configurations.importer import install

# Set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "impress.settings")
os.environ.setdefault("DJANGO_CONFIGURATION", "Development")

install(check_options=True)

app = Celery("impress")

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Load task modules from all registered Django apps.
app.autodiscover_tasks()
590
src/backend/impress/settings.py
Executable file
@@ -0,0 +1,590 @@
"""
Django settings for impress project.

Generated by 'django-admin startproject' using Django 3.1.5.

For more information on this file, see
https://docs.djangoproject.com/en/3.1/topics/settings/

For the full list of settings and their values, see
https://docs.djangoproject.com/en/3.1/ref/settings/
"""
import json
import os

from django.utils.translation import gettext_lazy as _

import sentry_sdk
from configurations import Configuration, values
from sentry_sdk.integrations.django import DjangoIntegration

# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
DATA_DIR = os.path.join("/", "data")


def get_release():
    """
    Get the current release of the application.

    By release, we mean the release from the version.json file à la Mozilla [1]
    (if any). If this file has not been found, it defaults to "NA".

    [1]
    https://github.com/mozilla-services/Dockerflow/blob/master/docs/version_object.md
    """
    # Try to get the current release from the version.json file generated by the
    # CI during the Docker image build
    try:
        with open(os.path.join(BASE_DIR, "version.json"), encoding="utf8") as version:
            return json.load(version)["version"]
    except FileNotFoundError:
        return "NA"  # Default: not available


class Base(Configuration):
    """
    This is the base configuration every configuration (aka environment) should inherit from. It
    is recommended to configure third-party applications by creating configuration mixins in
    ./configurations and composing the Base configuration with those mixins.

    It depends on an environment variable that SHOULD be defined:

    * DJANGO_SECRET_KEY

    You may also want to override the default configuration by setting the following environment
    variables:

    * DJANGO_SENTRY_DSN
    * DB_NAME
    * DB_HOST
    * DB_PASSWORD
    * DB_USER
    """

    DEBUG = False
    USE_SWAGGER = False

    API_VERSION = "v1.0"

    # Security
    ALLOWED_HOSTS = values.ListValue([])
    SECRET_KEY = values.Value(None)

    # Application definition
    ROOT_URLCONF = "impress.urls"
    WSGI_APPLICATION = "impress.wsgi.application"

    # Database
    DATABASES = {
        "default": {
            "ENGINE": values.Value(
                "django.db.backends.postgresql_psycopg2",
                environ_name="DB_ENGINE",
                environ_prefix=None,
            ),
            "NAME": values.Value(
                "impress", environ_name="DB_NAME", environ_prefix=None
            ),
            "USER": values.Value("dinum", environ_name="DB_USER", environ_prefix=None),
            "PASSWORD": values.Value(
                "pass", environ_name="DB_PASSWORD", environ_prefix=None
            ),
            "HOST": values.Value(
                "localhost", environ_name="DB_HOST", environ_prefix=None
            ),
            "PORT": values.Value(5432, environ_name="DB_PORT", environ_prefix=None),
        }
    }
    DEFAULT_AUTO_FIELD = "django.db.models.AutoField"

    # Static files (CSS, JavaScript, Images)
    STATIC_URL = "/static/"
    STATIC_ROOT = os.path.join(DATA_DIR, "static")
    MEDIA_URL = "/media/"
    MEDIA_ROOT = os.path.join(DATA_DIR, "media")

    SITE_ID = 1

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3.S3Storage",
        },
        "staticfiles": {
            "BACKEND": values.Value(
                "whitenoise.storage.CompressedManifestStaticFilesStorage",
                environ_name="STORAGES_STATICFILES_BACKEND",
            ),
        },
    }

    # Media
    AWS_S3_ENDPOINT_URL = values.Value(
        environ_name="AWS_S3_ENDPOINT_URL", environ_prefix=None
    )
    AWS_S3_ACCESS_KEY_ID = values.Value(
        environ_name="AWS_S3_ACCESS_KEY_ID", environ_prefix=None
    )
    AWS_S3_SECRET_ACCESS_KEY = values.Value(
        environ_name="AWS_S3_SECRET_ACCESS_KEY", environ_prefix=None
    )
    AWS_S3_REGION_NAME = values.Value(
        environ_name="AWS_S3_REGION_NAME", environ_prefix=None
    )
    AWS_STORAGE_BUCKET_NAME = values.Value(
        "impress-media-storage",
        environ_name="AWS_STORAGE_BUCKET_NAME",
        environ_prefix=None,
    )

    S3_VERSIONS_PAGE_SIZE = 50

    # Internationalization
    # https://docs.djangoproject.com/en/3.1/topics/i18n/

    # Languages
    LANGUAGE_CODE = values.Value("en-us")

    DRF_NESTED_MULTIPART_PARSER = {
        # The output of the parser is converted to a QueryDict;
        # if set to False, a plain Python dict is returned
        "querydict": False,
    }

    # Careful! Languages should be ordered by priority, as this tuple is used to get
    # fallback/default languages throughout the app.
    LANGUAGES = values.SingleNestedTupleValue(
        (
            ("en-us", _("English")),
            ("fr-fr", _("French")),
        )
    )

    LOCALE_PATHS = (os.path.join(BASE_DIR, "locale"),)

    TIME_ZONE = "UTC"
    USE_I18N = True
    USE_TZ = True

    # Templates
    TEMPLATES = [
        {
            "BACKEND": "django.template.backends.django.DjangoTemplates",
            "DIRS": [os.path.join(BASE_DIR, "templates")],
            "OPTIONS": {
                "context_processors": [
                    "django.contrib.auth.context_processors.auth",
                    "django.contrib.messages.context_processors.messages",
                    "django.template.context_processors.csrf",
                    "django.template.context_processors.debug",
                    "django.template.context_processors.i18n",
                    "django.template.context_processors.media",
                    "django.template.context_processors.request",
                    "django.template.context_processors.tz",
                ],
                "loaders": [
                    "django.template.loaders.filesystem.Loader",
                    "django.template.loaders.app_directories.Loader",
                ],
            },
        },
    ]

    MIDDLEWARE = [
        "django.middleware.security.SecurityMiddleware",
        "whitenoise.middleware.WhiteNoiseMiddleware",
        "django.contrib.sessions.middleware.SessionMiddleware",
        "django.middleware.locale.LocaleMiddleware",
        "django.middleware.clickjacking.XFrameOptionsMiddleware",
        "corsheaders.middleware.CorsMiddleware",
        "django.middleware.common.CommonMiddleware",
        "django.middleware.csrf.CsrfViewMiddleware",
        "django.contrib.auth.middleware.AuthenticationMiddleware",
        "django.contrib.messages.middleware.MessageMiddleware",
        "dockerflow.django.middleware.DockerflowMiddleware",
    ]

    AUTHENTICATION_BACKENDS = [
        "django.contrib.auth.backends.ModelBackend",
        "core.authentication.backends.OIDCAuthenticationBackend",
    ]

    # Django applications from the highest priority to the lowest
    INSTALLED_APPS = [
        # impress
        "core",
        "demo",
        "drf_spectacular",
        # Third party apps
        "corsheaders",
        "dockerflow.django",
        "rest_framework",
        "parler",
        "easy_thumbnails",
        # Django
        "django.contrib.admin",
        "django.contrib.auth",
        "django.contrib.contenttypes",
        "django.contrib.postgres",
        "django.contrib.sessions",
        "django.contrib.sites",
        "django.contrib.messages",
        "django.contrib.staticfiles",
        # OIDC third party
        "mozilla_django_oidc",
    ]

    # Cache
    CACHES = {
        "default": {"BACKEND": "django.core.cache.backends.locmem.LocMemCache"},
    }

    REST_FRAMEWORK = {
        "DEFAULT_AUTHENTICATION_CLASSES": (
            "mozilla_django_oidc.contrib.drf.OIDCAuthentication",
            "rest_framework.authentication.SessionAuthentication",
        ),
        "DEFAULT_PARSER_CLASSES": [
            "rest_framework.parsers.JSONParser",
            "nested_multipart_parser.drf.DrfNestedParser",
        ],
        "EXCEPTION_HANDLER": "core.api.exception_handler",
        "DEFAULT_PAGINATION_CLASS": "rest_framework.pagination.PageNumberPagination",
        "PAGE_SIZE": 20,
        "DEFAULT_VERSIONING_CLASS": "rest_framework.versioning.URLPathVersioning",
        "DEFAULT_SCHEMA_CLASS": "drf_spectacular.openapi.AutoSchema",
    }

    SPECTACULAR_SETTINGS = {
        "TITLE": "Impress API",
        "DESCRIPTION": "This is the impress API schema.",
        "VERSION": "1.0.0",
        "SERVE_INCLUDE_SCHEMA": False,
        "ENABLE_DJANGO_DEPLOY_CHECK": values.BooleanValue(
            default=False,
            environ_name="SPECTACULAR_SETTINGS_ENABLE_DJANGO_DEPLOY_CHECK",
        ),
        "COMPONENT_SPLIT_REQUEST": True,
        # OTHER SETTINGS
        "SWAGGER_UI_DIST": "SIDECAR",  # shorthand to use the sidecar instead
        "SWAGGER_UI_FAVICON_HREF": "SIDECAR",
        "REDOC_DIST": "SIDECAR",
    }

    # Mail
    EMAIL_BACKEND = values.Value("django.core.mail.backends.smtp.EmailBackend")
    EMAIL_HOST = values.Value(None)
    EMAIL_HOST_USER = values.Value(None)
    EMAIL_HOST_PASSWORD = values.Value(None)
    EMAIL_PORT = values.PositiveIntegerValue(None)
    EMAIL_USE_TLS = values.BooleanValue(False)
    EMAIL_FROM = values.Value("from@example.com")

    AUTH_USER_MODEL = "core.User"
    INVITATION_VALIDITY_DURATION = 604800  # 7 days, in seconds

    # CORS
    CORS_ALLOW_CREDENTIALS = True
    CORS_ALLOW_ALL_ORIGINS = values.BooleanValue(True)
    CORS_ALLOWED_ORIGINS = values.ListValue([])
    CORS_ALLOWED_ORIGIN_REGEXES = values.ListValue([])

    # Sentry
    SENTRY_DSN = values.Value(None, environ_name="SENTRY_DSN")

    # Easy thumbnails
    THUMBNAIL_EXTENSION = "webp"
    THUMBNAIL_TRANSPARENCY_EXTENSION = "webp"
    THUMBNAIL_ALIASES = {}

    # Celery
    CELERY_BROKER_URL = values.Value("redis://redis:6379/0")
    CELERY_BROKER_TRANSPORT_OPTIONS = values.DictValue({})

    # Session
    SESSION_ENGINE = "django.contrib.sessions.backends.cache"
    SESSION_CACHE_ALIAS = "default"
    SESSION_COOKIE_AGE = 60 * 60 * 12

    # OIDC - Authorization Code Flow
    OIDC_CREATE_USER = values.BooleanValue(
        default=True,
        environ_name="OIDC_CREATE_USER",
    )
    OIDC_RP_SIGN_ALGO = values.Value(
        "RS256", environ_name="OIDC_RP_SIGN_ALGO", environ_prefix=None
    )
    OIDC_RP_CLIENT_ID = values.Value(
        "impress", environ_name="OIDC_RP_CLIENT_ID", environ_prefix=None
    )
    OIDC_RP_CLIENT_SECRET = values.Value(
        None,
        environ_name="OIDC_RP_CLIENT_SECRET",
        environ_prefix=None,
    )
    OIDC_OP_JWKS_ENDPOINT = values.Value(
        environ_name="OIDC_OP_JWKS_ENDPOINT", environ_prefix=None
    )
    OIDC_OP_AUTHORIZATION_ENDPOINT = values.Value(
        environ_name="OIDC_OP_AUTHORIZATION_ENDPOINT", environ_prefix=None
    )
    OIDC_OP_TOKEN_ENDPOINT = values.Value(
        None, environ_name="OIDC_OP_TOKEN_ENDPOINT", environ_prefix=None
    )
    OIDC_OP_USER_ENDPOINT = values.Value(
        None, environ_name="OIDC_OP_USER_ENDPOINT", environ_prefix=None
    )
    OIDC_OP_LOGOUT_ENDPOINT = values.Value(
        None, environ_name="OIDC_OP_LOGOUT_ENDPOINT", environ_prefix=None
    )
    OIDC_AUTH_REQUEST_EXTRA_PARAMS = values.DictValue(
        {}, environ_name="OIDC_AUTH_REQUEST_EXTRA_PARAMS", environ_prefix=None
    )
    OIDC_RP_SCOPES = values.Value(
        "openid email", environ_name="OIDC_RP_SCOPES", environ_prefix=None
    )
    LOGIN_REDIRECT_URL = values.Value(
        None, environ_name="LOGIN_REDIRECT_URL", environ_prefix=None
    )
    LOGIN_REDIRECT_URL_FAILURE = values.Value(
        None, environ_name="LOGIN_REDIRECT_URL_FAILURE", environ_prefix=None
    )
    LOGOUT_REDIRECT_URL = values.Value(
        None, environ_name="LOGOUT_REDIRECT_URL", environ_prefix=None
    )
    OIDC_USE_NONCE = values.BooleanValue(
        default=True, environ_name="OIDC_USE_NONCE", environ_prefix=None
    )
    OIDC_REDIRECT_REQUIRE_HTTPS = values.BooleanValue(
        default=False, environ_name="OIDC_REDIRECT_REQUIRE_HTTPS", environ_prefix=None
    )
    OIDC_REDIRECT_ALLOWED_HOSTS = values.ListValue(
        default=[], environ_name="OIDC_REDIRECT_ALLOWED_HOSTS", environ_prefix=None
    )
    OIDC_STORE_ID_TOKEN = values.BooleanValue(
        default=True, environ_name="OIDC_STORE_ID_TOKEN", environ_prefix=None
    )
    ALLOW_LOGOUT_GET_METHOD = values.BooleanValue(
        default=True, environ_name="ALLOW_LOGOUT_GET_METHOD", environ_prefix=None
    )

    # pylint: disable=invalid-name
    @property
    def ENVIRONMENT(self):
        """Environment in which the application is launched."""
        return self.__class__.__name__.lower()

    # pylint: disable=invalid-name
    @property
    def RELEASE(self):
        """
        Return the release information.

        Delegate to the module function to enable easier testing.
        """
        return get_release()

    # pylint: disable=invalid-name
    @property
    def PARLER_LANGUAGES(self):
        """
        Return languages for Parler computed from the LANGUAGES and LANGUAGE_CODE settings.
        """
        return {
            self.SITE_ID: tuple({"code": code} for code, _name in self.LANGUAGES),
            "default": {
                "fallbacks": [self.LANGUAGE_CODE],
                "hide_untranslated": False,
            },
        }

    @classmethod
    def post_setup(cls):
        """Post setup configuration.

        This is the place where you can configure settings that require other
        settings to be loaded.
        """
        super().post_setup()

        # The SENTRY_DSN setting should be available to activate sentry for an environment
        if cls.SENTRY_DSN is not None:
            sentry_sdk.init(
                dsn=cls.SENTRY_DSN,
                environment=cls.__name__.lower(),
                release=get_release(),
                integrations=[DjangoIntegration()],
            )
            with sentry_sdk.configure_scope() as scope:
                scope.set_extra("application", "backend")


class Build(Base):
    """Settings used when the application is built.

    This environment should not be used to run the application. Just to build it with non-blocking
    settings.
    """

    SECRET_KEY = values.Value("DummyKey")
    STORAGES = {
        "default": {
            "BACKEND": "django.core.files.storage.FileSystemStorage",
        },
        "staticfiles": {
            "BACKEND": values.Value(
                "whitenoise.storage.CompressedManifestStaticFilesStorage",
                environ_name="STORAGES_STATICFILES_BACKEND",
            ),
        },
    }


class Development(Base):
    """
    Development environment settings

    We set DEBUG to True and configure the server to respond from all hosts.
    """

    ALLOWED_HOSTS = ["*"]
    CORS_ALLOW_ALL_ORIGINS = True
    CSRF_TRUSTED_ORIGINS = ["http://localhost:8072", "http://localhost:3000"]
    DEBUG = True

    SESSION_COOKIE_NAME = "impress_sessionid"

    USE_SWAGGER = True

    def __init__(self):
        # pylint: disable=invalid-name
        self.INSTALLED_APPS += ["django_extensions", "drf_spectacular_sidecar"]


class Test(Base):
    """Test environment settings"""

    LOGGING = values.DictValue(
        {
            "version": 1,
            "disable_existing_loggers": False,
            "handlers": {
                "console": {
                    "class": "logging.StreamHandler",
                },
            },
            "loggers": {
                "impress": {
                    "handlers": ["console"],
                    "level": "DEBUG",
                },
            },
        }
    )
    PASSWORD_HASHERS = [
        "django.contrib.auth.hashers.MD5PasswordHasher",
    ]
|
||||||
|
USE_SWAGGER = True
|
||||||
|
|
||||||
|
CELERY_TASK_ALWAYS_EAGER = values.BooleanValue(True)
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
# pylint: disable=invalid-name
|
||||||
|
self.INSTALLED_APPS += ["drf_spectacular_sidecar"]
|
||||||
|
|
||||||
|
|
||||||
|
class ContinuousIntegration(Test):
|
||||||
|
"""
|
||||||
|
Continuous Integration environment settings
|
||||||
|
|
||||||
|
nota bene: it should inherit from the Test environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class Production(Base):
|
||||||
|
"""
|
||||||
|
Production environment settings
|
||||||
|
|
||||||
|
You must define the ALLOWED_HOSTS environment variable in Production
|
||||||
|
configuration (and derived configurations):
|
||||||
|
ALLOWED_HOSTS=["foo.com", "foo.fr"]
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Security
|
||||||
|
ALLOWED_HOSTS = values.ListValue(None)
|
||||||
|
CSRF_TRUSTED_ORIGINS = values.ListValue([])
|
||||||
|
SECURE_BROWSER_XSS_FILTER = True
|
||||||
|
SECURE_CONTENT_TYPE_NOSNIFF = True
|
||||||
|
|
||||||
|
# SECURE_PROXY_SSL_HEADER allows to fix the scheme in Django's HttpRequest
|
||||||
|
# object when your application is behind a reverse proxy.
|
||||||
|
#
|
||||||
|
# Keep this SECURE_PROXY_SSL_HEADER configuration only if :
|
||||||
|
# - your Django app is behind a proxy.
|
||||||
|
# - your proxy strips the X-Forwarded-Proto header from all incoming requests
|
||||||
|
# - Your proxy sets the X-Forwarded-Proto header and sends it to Django
|
||||||
|
#
|
||||||
|
# In other cases, you should comment the following line to avoid security issues.
|
||||||
|
# SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
|
||||||
|
SECURE_PROXY_SSL_HEADER = ("HTTP_X_FORWARDED_PROTO", "https")
|
||||||
|
|
||||||
|
# Modern browsers require to have the `secure` attribute on cookies with `Samesite=none`
|
||||||
|
CSRF_COOKIE_SECURE = True
|
||||||
|
SESSION_COOKIE_SECURE = True
|
||||||
|
|
||||||
|
# Privacy
|
||||||
|
SECURE_REFERRER_POLICY = "same-origin"
|
||||||
|
|
||||||
|
CACHES = {
|
||||||
|
"default": {
|
||||||
|
"BACKEND": "django_redis.cache.RedisCache",
|
||||||
|
"LOCATION": values.Value(
|
||||||
|
"redis://redis:6379/1",
|
||||||
|
environ_name="REDIS_URL",
|
||||||
|
environ_prefix=None,
|
||||||
|
),
|
||||||
|
"OPTIONS": {
|
||||||
|
"CLIENT_CLASS": "django_redis.client.DefaultClient",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class Feature(Production):
|
||||||
|
"""
|
||||||
|
Feature environment settings
|
||||||
|
|
||||||
|
nota bene: it should inherit from the Production environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class Staging(Production):
|
||||||
|
"""
|
||||||
|
Staging environment settings
|
||||||
|
|
||||||
|
nota bene: it should inherit from the Production environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class PreProduction(Production):
|
||||||
|
"""
|
||||||
|
Pre-production environment settings
|
||||||
|
|
||||||
|
nota bene: it should inherit from the Production environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class Demo(Production):
|
||||||
|
"""
|
||||||
|
Demonstration environment settings
|
||||||
|
|
||||||
|
nota bene: it should inherit from the Production environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
STORAGES = {
|
||||||
|
"default": {
|
||||||
|
"BACKEND": "django.core.files.storage.FileSystemStorage",
|
||||||
|
},
|
||||||
|
"staticfiles": {
|
||||||
|
"BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
|
||||||
|
},
|
||||||
|
}
|
||||||
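The environment classes above are selected at runtime by django-configurations via the `DJANGO_CONFIGURATION` environment variable. A minimal, hypothetical sketch of that name-to-class selection (the toy classes and `resolve_configuration` helper are illustrative stand-ins, not the real django-configurations loader):

```python
import os


# Toy stand-ins for the settings classes above, for illustration only.
class Base:
    DEBUG = False


class Development(Base):
    DEBUG = True


class Production(Base):
    pass


def resolve_configuration(env=None):
    """Pick a settings class by name, mirroring the DJANGO_CONFIGURATION lookup."""
    env = os.environ if env is None else env
    name = env.get("DJANGO_CONFIGURATION", "Development")
    return {"Development": Development, "Production": Production}[name]


print(resolve_configuration({"DJANGO_CONFIGURATION": "Production"}).DEBUG)  # False
```

This is why `wsgi.py` and `manage.py` only need to call `os.environ.setdefault("DJANGO_CONFIGURATION", ...)`: the class name, not a module path, decides which environment's settings apply.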
48
src/backend/impress/urls.py
Normal file
@@ -0,0 +1,48 @@
"""URL configuration for the impress project"""

from django.conf import settings
from django.conf.urls.static import static
from django.contrib import admin
from django.contrib.staticfiles.urls import staticfiles_urlpatterns
from django.urls import include, path, re_path

from drf_spectacular.views import (
    SpectacularJSONAPIView,
    SpectacularRedocView,
    SpectacularSwaggerView,
)

urlpatterns = [
    path("admin/", admin.site.urls),
    path("", include("core.urls")),
]

if settings.DEBUG:
    urlpatterns = (
        urlpatterns
        + staticfiles_urlpatterns()
        + static(settings.MEDIA_URL, document_root=settings.MEDIA_ROOT)
    )


if settings.USE_SWAGGER or settings.DEBUG:
    urlpatterns += [
        path(
            f"{settings.API_VERSION}/swagger.json",
            SpectacularJSONAPIView.as_view(
                api_version=settings.API_VERSION,
                urlconf="core.urls",
            ),
            name="client-api-schema",
        ),
        path(
            f"{settings.API_VERSION}/swagger/",
            SpectacularSwaggerView.as_view(url_name="client-api-schema"),
            name="swagger-ui-schema",
        ),
        re_path(
            f"{settings.API_VERSION}/redoc/",
            SpectacularRedocView.as_view(url_name="client-api-schema"),
            name="redoc-schema",
        ),
    ]
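Every schema route above is prefixed with `settings.API_VERSION` through f-strings. A small sketch of that prefixing (the `"v1.0"` value and the `versioned_routes` helper are assumptions for illustration, not taken from the diff):

```python
def versioned_routes(api_version):
    """Mirror the f-string route construction used for the schema views."""
    return [
        f"{api_version}/swagger.json",
        f"{api_version}/swagger/",
        f"{api_version}/redoc/",
    ]


print(versioned_routes("v1.0"))  # ['v1.0/swagger.json', 'v1.0/swagger/', 'v1.0/redoc/']
```

Bumping `API_VERSION` in settings therefore moves all three schema endpoints at once, without touching `urls.py`.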
17
src/backend/impress/wsgi.py
Normal file
@@ -0,0 +1,17 @@
"""
WSGI config for the impress project.

It exposes the WSGI callable as a module-level variable named ``application``.

For more information on this file, see
https://docs.djangoproject.com/en/3.1/howto/deployment/wsgi/
"""

import os

from configurations.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "impress.settings")
os.environ.setdefault("DJANGO_CONFIGURATION", "Development")

application = get_wsgi_application()
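`wsgi.py` exposes a module-level `application` callable for the WSGI server. Independent of Django, the contract that callable satisfies can be sketched with the standard library alone (this toy handler is illustrative, not the project's actual application):

```python
from wsgiref.util import setup_testing_defaults


def application(environ, start_response):
    # A minimal WSGI callable: receive the request environ, report the status
    # and headers through start_response, and return an iterable of bytes.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ok"]


# Drive the callable by hand with a synthetic request.
environ = {}
setup_testing_defaults(environ)
captured = {}


def start_response(status, headers):
    captured["status"] = status
    captured["headers"] = headers


body = b"".join(application(environ, start_response))
print(captured["status"], body)  # 200 OK b'ok'
```

Gunicorn (or any WSGI server) imports `impress.wsgi:application` and calls it exactly like this for every request.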